var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz
̬%/{etۙbAw̹:?{ZqƽryYH`9"g^ci(bQ+*߽Fst1.!Vel:.|`zW[7Q&57kH+l* * {n\v;b!X&,|T,\[Ԯ1 m'GɫW>x/ 75YXShScJP}ͮ4='_- O30CD]=Vq҂Ƞ0%@hֵ;vʇ  [s*;[黗9w#v:=+!澠vѱ'jB޺P UQ$C=S@:Do9ENY66` e-GmWҨƺ(thY&3'XEe |B@L,;Ps7AJ?5>vED"*Wtʚ!\J^,(9GRNE$s#边ΡZR!ԡZAm8<[`9K$ 0:M-;#b7sPR~_Q't}Z6JvE쌋0ℋPb02d2$" kqT4 mL7~z}C 7H}_9r+1 i7 icIpSse?:}Dv3^CHK>=ʧf|ƿ,v6&o~#\TGr/gdp*ǖepqVf.Yq]WtZD9- Ų-T`g#f:?̾7ƒIr`. c8L%(ne뉇.&f9-߭ʢ˹r2SذDz3aqhۑYql󋳓|~ßfb2g]Ѭ ͤ헛3ϧgkS[ /ْj?w~:k^\5sjUҚ  [>r*{v><3\\z^~V/ 0ծ3RJ{u?jr(ɻ7_ ~wqĮEʼ1F_Uᅴ/M3fVz 9gpA`#n"n֒wnV0+ilZAU3ؼjZz)pլ}܇U;+D/*n{bઙ^ \F}fWF{P/ rઙkKf{vլ0k+Ҟ.aZ>Ƴyrr4IL{Y)ϯ~:.g1ḭ9yo?k&FR Q`sD^>wwv{8'/L||zq/?woI~}+?ǁțm.[_tUc=f9xL~50Vw>kY'ګZM P cB7^kcWl&W3,pCw78_-rU­J_(POTvªo@b1&eJyl{mMm;Hm;_:ߴlRbsզP 72lۖ@ke]7}ABL82J+X]@[5 DE{V6+cUnl)vbd=Rg(?}Ze;,,]C8h:}>{BVv$&o12A-!*E% ygفcъj!r0m:.A| !XoS|-B8Uͪnn7oNO&_Z[_n){a6{Gd4:xd_5"'`$r,: )lJzDN#E~=V"rCyk+V-RrU T|)K R*:I}xc /;^wo6?Xdg?^̐\ǻf i=^k=,Z޽ΓءJfno_|ΗGJC 6:{5o3Gx\=dW}K\]%nV D/^ʰ cD1yыb|;,rJ6@Tlbփ*bAd1W%;TlYVS/N~=:?8'Xy3/G? %O@Wr7ݔ1IG3 no 2+e. \lsSaйU?-hb\}ʒMs:Ҷ3GX}Ik\'S{bt堸r7s|{D( ͒<:i}vuzUvj;Mᶟfccw6F @R:ZXiD:pT Nl+ VLɻF]9BI˯u90y+R֢-g 8gfre Hv%#+x#{w%]9)bWBMzYFUPa!RCpIN+K ' }Dm) 4e?i?Դ#CA7Yhx'Ü;b2} YE[,y,9ןbٖlEnrLX4YfU}U bLGcRnFh1I0$|"ey&V+=#uH :62*A J?hD`p?ELcBq>[8R_EW|şi?6zeh~ñٹW]=Uw=ZzA+ >>?M/sjFiT1'b0]toۓpǤ&ytqO,?X~ûwBw7?OsʬNv̓IAn-׿|Һy}]>r_׀w{}r[cv e ~ }4˵yU^8}љl*z\ l~STt*؊/֋ZFTa|?֛iaӾ>%H_z2^qHQR,*dʰ-*5L18 QR ކkZ|LDx }(# b|Eե&)&IQ$Qg""CλHB :\h~uq4}:6m3Kp)\}ػnա:tȶ<ǃ<,.p^ : @DQT"%Sh¬syLQQL>e_{=r PAu|ͷ Ԓӫ1=Ѡ^GBg4y\ՆY#hd0ۼɭ߽wŻdqTڷ95P#ŏq4fzjA`ďp@m@ŋE>T$'mX`^L J/|6}C0A߲{o\]̫n5qb~==66{I1(w{o#'O(~u)Rt>@FTo5lA\ PIT"xp^dߔBǔs.|;ɼأrfg\{2.3^ZJԞRSVXhKniDhE"5o4E$%ccHQ0؃ T =5rOV&JgITMvi/RvE2 YINApψQ(TSYSS8m&X]Ff%K"APgX&^N*ʡ!c;JM>4/YYqopj-o'^23y٫x(U "KlĨX2*"HLEBM5 ވ[0 .۷<\>M}9~7 jkwR[mo<} =,3NiAvQH%]((RcMX*dc,vqmDl=Hz>?Wsnx|6mi9鴝#}Z?\ӖWip=KYEA:i-J&4%J AcN@S($ 3$"}ȟ)0}G*#rJ*8)([Oݝq~~fM| n:BoRB,>:{t7QTDQR W흵i60%1 $5"Y^^0*Td (vLPyh"k+N,' S/K~1gqƈ Y(ɐXK˵ ~f+&d^%(3a/zv^|sPP2'I% H"Xd/VJ silޒSAm%C$O1?qBL:Ac̒0t 0XFOEk.U0A[kWji.m(\eaYC+u@}nGS/8zY뎲7G&@y1ydvϜ<ޓÒն۰ˁLR4ؠ=܏y3Vm3[?k8~cy\N х L]V<_+ ,)Ĥ(r["|̐bt1'K$jFG7ّEo/ ӿs Ub$L9*],AR8cy _Y,-=ꫥt# 9H"m&uI>)ze7qZ7 |;>~c)~I*޾܅%t`j1wTJw;0Qb 8X i}haddީL$@>e #jJ!+MZ|6 mDJL|2aʽppgC%%zU띺*QR29aM*"dIS  e|D夒6EHx풴X- =UWu1^/ykAug'|`ՕF?/J)4F~5t(PF (<ɩd_ =3KdL 8ct3~ TP.Mh#z`CΠV{;{  2w`-Mhlh $U B5Ex8HāຍDX |?GMB!NP Y\f_RŽ) YU)ۺ0mW7~ѳcaΎȡqbHq b*Z+P&[! LQ!z*"UJ*h=K+ilf5kӱpGg'PsGWXsC'!M;dT/% Hc)6Nyɍf9p; I1`8ޠR&05JI T!)7HW `]t]g#)` f%d!0`e Ԓ@,6ZJ-KdɋoPA*z6D|—ȴ8_j/mޕK`<̽ t\TWhf&FN$, XDH)FV@Z-(JKFcB tcGck"?M_k`G.U}w8zw3Rw<ŗ&mEnެy,;- d:oFhJT* v]։<&?9"?0][Lk Шe֨gbm՘NQDeUJ{#Nbդ;u8HdM|"aB%1%Ú `-bI1L5cGIg 6T.KmaSRt.Ko%(RH O4!:|A}l]SFY]l9Lx?{WFOt—"Y4,p3vE:Bgq2WlIҶ"ѱ(VzTuJ2e ш m4j3q6Rk5w} ˚%zOprUɶtbWt<%ʱww1by_b0{O[S_ljRҁ?κMQ J O2DBA.땎R7qg]P1*^''LhLX JURA526g͸J3,l[bߔ _UfWg/:FUMm]u}-S\aN/ |oKE$8#vӚ/L;m|DoR>XT DV@&@j;m)b L&r9k(dUV,G(>"Nuvf܏S*H^qoҏmQ6FDyD#"Y~/˘$((bD)5:Bp"^ +Sqx_SCـX'59PLdC63mSbOZIHͲ1"6~D~y#ljibʹd[\ԍqQqwF/vJa (/ Rp50#.>.ͻVڱ/x{.z"kF|5YQY$)Z咥k9zCvHY9Khf hfŲǤmSF2 A336-E $M R%0DX1 j%C@ڣ>e[lqV/n^ϩZ^?-ݲPy|K%%]. +H*?wq=Z7󗣫~Z蟃ꖫb@qiَ6)(-81'mS=6Qw'/2t)@фF{NhCdi(^4Jjxyt (Vz/]9_Wk^(A?aDdhs@0SX>AocqFy:ơz'VWnd I}ace]o?}b{nBk7*H[=o9l oHyahNze?sBg. 
ȜkDԁ1d5)T9|MV#P^lMd2Q)VKm'N)>A`SpoKP$=)2dũJ  `(&A(Mr+q6.pyvn7}=&NMh^M t!/{ڷx_ޤ ̮̂#_} }qjn:-ֳJZ& 7UإswIEIzWḃ;^M`\lݯ~ï.n/9?nҢyXiWѳݤRvKff귌"/LlKU=qntO,Ix| 1ܧ,tyqE,bo*nUذD:8y -n~[)ŏ?ԑ*㳫PhTF|n=6M.oDa\-_NzrB*W,PJľUR#\}-pb&ZH W)DžQ3(pR*WpbsZ\{m^jP\hUיC+ pUW \)ޒ WmF-3IOG/bu}nM,NHSO;Fr:_\J<gL'e'HJ55v5OfM6RWdxbG<\Hpv>iv"U78A|K^ϰ?zX Xڰd>O}0*5OFW Jե5|&\ՏwOt|:v|)}(Ojjxn6ʶPmrWI N =:.qڪtˋII{ZCv)F֬:Ђ: t Q`vQ^'Ww0\ 8<*{OJ}!> b0WU\u0q*-}*UG @;qHV \Uq?bi_BWUJy|-•a l8:<biWU#\}p孻]MguUmrM|t.gU-d/Ib}B7췂{ϼ$%j/@8T+R{swɧu;ƨCyXjnYf>+uq`_M^5|ֶՈk2E` JZ-tcDIO2J-Qb"{˨eҐ(I9[nYiGq~T`ǻf;1~\p'hn6nON~^F!&_O՛Lr{5yA&Tu kl%:!ܩq0\! r+ܟ[2\,Ntj6V%|4*Z+xYf#d@DoSR 6 KP]jZ2@J9 y֙W;^S<@`mA b= Ex}||ht:^S/>CgY |._W,(8lR~2R9|.77ڨOb} 9QZ yoH.+6ksf-K&|gZ)B]`]`ǫD^U&"(M1T(A;Bd~`T ^Wkw]baĜZuc;k&Άv_bvw(* d0s <5f" ZBZjt" jtKfv[%Oth-MעKYm&YobD aL,k6E IaB LTJO:I7-+( u0@*഍4 lЉB𩨀!U"&:Gž4k[6mCv3Mi/d GKC *%ū8*d@G]DeM=k"66UM1zg|!-BL*)L> (N:Xd2R>G;!c>n!Zol[ ^šdM._tn#Vػ D^%$ QMUd+pQdx o"CxGZL$׹\JB'%1%Yzö{2BSR>B A"IX"]p("*'iRI3}nmMx EN;mT0d>gDadጂe1ЄLJ6ĹՌ$qz?mcA6 yRzKASR xI3^+̙\(KF>oEgkG#i&_+ï־.bV :)%j<j§-~tq0xY5tY}Dɖ(Tu2u4vJ*W&8}8{.i(i5PշfAFǤE YAmu h(sHux@9lMQ(09S##aȒȻ"FPb:xm-n(i&ΆdNާ}fS)t%e2gKV&\Rh 9+2w)&ۦwu^ olŧl jJ2e ш m4j3q6R9K̞<4ӓ-sUT\CˮλxJ*~0}îk5wy_i0{OFSjf8=9PT?Rd<`F}OV@cӰуyɽ5C3i;AӟgEiZyގ`a X,%tlxi[Fx;{!gG-.xr~7Wf5朷tFVO fU>Ѵx-lڋ܁)ynK׌*N88 *:D9N%<)X]Ÿ m 1h Ld]ǗCn:oO< ~E^beTuăuvJZ_\>`{Oo_kH8a-4 ``vtxea[h?ʖtTUS3hX\޷cY\zn*@fFHD&6C`3^ؽf@0vc%RaU @Edz.YI/&eD7bK/a@eHz0,8 pZx[E @feZwv u|nnGM[>:t`u3jX;6W#.(I#|VX^y$y E!eLF\ZMtJrU6x\;O-.bFas"@YE-%bug^}|;MrSųWR-sMf^1b\sSժ1] Ai+D@m0cb$Pe1V)Z ^r#1Z]8H6c 1ؔ$#(jn;P[{1){:~ƮNR6W]j3`fx*]Qzt5-zxK2̠6f]!5;.su9aku3TptR輓!ݤ?(=b!R~Xnf x[z]|o>F-7DŘnS8p_jS/oM2gk˷(U5V?x4'x >k8Z# gu e#>g %or`Yz. `RW%L+e 4CrvĽwPžsLJCm&^?_zѬ@8|jWg-cZ 9yX $wp`I`Utd2 %8("o $3}3:mxn;;y8$mļ:/=NJ֚Mht8{)h}.n!ײRwQ..,X"^[K;̛+QӮTR\JY ۜ PKk>n\2>sRcF'a91,AEg vDz"$:^P]i9tueS+.0IDf.h0Ù`;&kI֫?nhe#]Ύ o胳^PN$fbL9K܆$5@JHаtdfy'gI3>87VsnK0٤Bf~6^&O,T%`ؠhyN//5>oTʼ7%yiM_.]^L[J AܝYU]S=dJD3n -솝]-%̏: '$Ug-7rtX >12u&n]B֏gY?5u1L)Z$UһKԯo˱^߼yJ12IHK!jp^ uX 6~GeɝxFMCoeDҽi񢃧_kUߧO~\>x ̞tީevљk?̃tf{%%Qo8nRt{Go14kj tcKַtԌhnƉf{ڲZ> Y,h82CdilUͭjX/wHiX6Ʃ7ʽ"4R6TiMcZY8}܋KZ"aw?so||\؏O4KĦ.X.Ypx_MK 4ml·os >>J<7zv?DZ2ϤUFE̜FIVJ'r_n6{qVEݫ??;^kEjB4B[ t?Sdcn_lGXW$YR8!LANy*7=wFYš%aw$چ̫4sp!bnwPx6*IxE&I$¨41uFV:49m=su2nNf;؞c9y>7=t[:qSR[2TR&Ce鮔%Ce-SPYz\޻>}1Nj$S=Ji TۛA*cUITD"RUVxKqG \ O#ҋD'2bvvK,%I&YjrX%;Z:FWRZn6xIayȵu"#}R>i ȝ+WMU{Ng mew'|6|_.Źb5+(vEɻvvWxriR}F'ڈT$2t<Y1 ӑDv!-z-폫3wˮ?BQӗi Ҳmi5‰[WڊKi*`T Ve.RBWР}3{wphud}'? y*@`d E)%p#=Lu^>NH))~Jf]b^n~~9tc}mbqa,CJ^zE  h T:z7…*c'"Z-N0t @HqZS#g"8LH KfsP&ˤUgld9 : ^wΛjWz?&$/w MխTb{j̡c}GG-RBR ,3hPٻ6r$W|c,3sr$xcK>KN9bwKeHT$=ĎEV7YUŧ<]ȤM?eUԑ'FSno`4=g<_ZKƤ!M[`ȌY H%9.p](x48\]hy<tYz̤ Z3gl9N6^1HLB#D)j53Y1t9F#֕T^{Ml.\J֟dQVYC m@QE. X'Y /0E6Ay)9͸3s9n$IHKHg] XYǂ$G5HtdլIt3%z +,,$Ĕ\Js5 \EZy-^Ol2m|fWFVlg]=?K7p"E.(g6CmԗP uRKR3k-b-\Rurm CKov ߩ ϿntvuR?AOF?j~^O~+D&ݷI;Ԥ׻eٗvo3yPzs5 ~\v?ޮ|n o koKW:Xvf6bv}]zV@3zPO'Lq2a:-Zy}i.鴫:SO'nԻH-˭]ΰ}PK#Nj|IzWhնfJjU eGy7/דۛiZ=vPqke'k\P}`6;&:F3({)ff6]Qsi &p^EG=;B&SF|FǬ--McdMIBU,JAma< rl2r dFwSeh1$ ^Vֳjg^2D-(>.hK`-͒@T iw"!IK9z&8)OrT*kN?7u2^$eLkv y%28h3)8FkZՠVZ\qRs,iV õ&D@RFR ( ,H"pb8Njkڜskٺ[{}{Ax7MscHoVy4ɕyq'\ yDF(yLnԶz,G4)$4SQjSd_嫼Ζ cV!f$Db\ H#8Y`1$qdklz+0֤ݩhGv 8. -M{*K H@aHթ1yMpWO685 |^)%ҁ|NsL&혒R-))NJ^|))TA䘅svJ-NX`Rptr2ǺIJ8KJ+I><^Z &ഞGtiuFFhN+jlǩvNHN'ov y08q+LO"WA5oYC&&IҨbpE)iA (<U{MWKKi< DxCԙ$! 
&9jY_9(A?X&jhNؤf(},MB]L,g$dA[w9,KWr\B"0'!dO 5p[贤[ډiԱFV %30f$}J8g$8x (utS9z}պvm/87J`|pF=weJL z72:7-uWسӠ|E/^oaL>/&sGُ^sjo(]vYV)=i"y6I&x!#S)y׼r1wl1ʜdL.9m!( U\Hն!Vf ͌=mdmmIWtۂNv[5yZ\·_~wC./g//拯bh<=a QrTtBi\4Hi:kPZ+Zld&Dz QHQ̦4,}+0WM]/ ɋa%X%r3T)4HEKVBY,WIZVn߿AHvq,tJjd_(+E1.>w>HJ"QJiG1Ԛb~&lK&b#w;m/>HF&iRvL~dmDžճ[WBbN+˝B &WYAaM#\&<;'߱GǧxrQzcw'XNAYI'9rrљtҥ1J3UZx>ၻ\䴜Nr%^g< &H<ׂ!kZlн]>=c%f[{ VеOYz[B9Tύ0?\I^-Ɩ<}A]TfS"W@JkbB)J.Fh^ptBx?H>aF.Q ^ k@rNKDdnpЉr2 a$-Zyb*V<,#Z%HH*ב[6DkggaN Saeϖ3oKdg|\Xg~N(xr5(8L ms#:ztT2::"71]>8 $}soy35 'M7Y]$E7CP0hKX?i>BM7ij?ֲ% zG}xÁU굡QV{IA[c+_JQF t:gUgh齳2 DKC[vOퟮbop*%V>SGop:Y}g'nL^rk軛1ӷ^mH{2~~i:hzν-Cat;AN/y1:ū`jJκmGLPOxo i=ͮfӂ*hC~@sÊ5E(⟭..Iuҩ'RgMsL/j-O>.h:p}dYCj,ӶQɛ:`n4<зJV,BYi5:>`$c3Qoٿ0Q[@Xr?Xǻ, D jmD- yzVǕ1c!=a^9JʓbNJ_gyZc&FĐ!&+%Y{%>nH X_yВP"J ZU:+̒{6$oMT]FO8b²c^F`-}f5A6.i YYUYy$FRiEЁHO]Ku%GpO\O%;.F{]4k% LFrP<1C*& P[PT(F>RX4@9<hMPy`rj䳮wk }UY| dݫqʹ}Hs*jj}gr&i7Z(*|`pP}i#I %4:q~gmg{֎z.P&sY~E|t&4^(\=F <Ĩ$H-v44a-~6TH<]vXM;Caw1{iks턅sC]묦:0+,1,p)c>*dPRَ5IWd 䴳FF `#VG#$%NEYdIӅCg7oԳ^ikCxv]QJfbV6ޮh-3YBK,w?* *8aL!G1)"vgVI8[;v,Q_HwJ̭(z Q ^{=ZL΄tLyMΠ*vj+kRxxmV*affwnA7WlV&٬m~֓DudK]ryt14W5\.Koё1VV 4˟ݷ{͋*]y9收K'ݟT8,-6?mn+;Vj#؝ܛ;ew>cMw;;og`2+>L/p'r˝[pdqso(m ʥk|bDbHGdA`4R!QmTAf0!2&IB\A$2$ss9J<80Zks Q%UntND ]IW-WpZ'eRו">ElȚuqb4ɤ}y' Z ,BSP ljFRԪgv]օ~]]FKLEQ7_Ӥ{>pC-\y4gPZ|fG1 "B -+@FV jğL<<8]o%UbDKH$XJRόn%exxa>疇o0~Ώ"Q6f/ߟZfIޚh0|1Q% .Q` FZmy+해vVSFUXr*ߵ0mլik6跭};7 Qڔ%(_VG;qN#\[Oу2OOu V6-m;!y>c ÃwZdO"E1jCCtAl*Yۮ #a:<0>Lc t;& qgyq;qz)ShN[AJUԣy>P?v~Ο @*\pa0hno^Ǔ#:~`kQ~pa<7ΰ#z{tXnv'ZyRHh:Lu5s0cMfW*VcKs=٘>{ߓU$N2iR:yFTxIJդdk7 ޞ)3I134 >>c'&P F?}S?!Ѩ d4f9͗V0~0y) ~)giy^C,/Vai}#Ņ7ˢ*?B;sg_q;Muy7/@BB̋h< edNgBo}Z,h#QfFM&a/O^ Iݕ<{}^ѰUr9jJϽ=J h7X[='PH.fJMB,*ADL(+X2nWTIpZ?0]a [9HJF`ssCin9r<h#EF&E w(?[UD~)g}߁F bȌ8)8!:DrP'6i(}mO%~i7Jɚ 3 &H##LZiYFzT;#BIRk% 8^$J&TΕڐQFB[RK'sk q~[|Z)w o$,geŧ&Z rdge9|ӼeL4ة}-8oP(" 'SOL͛O4gj䪫 = k}>.)4f{~q O&>GH| -yI;-^VN3ŜBM:Qb#(Gugc~3Yȑz|P %bo~iӥtPFS1F eEm"*kFą!5Jh!DAtjo\%|$\"\xԕё xP? h.2ECJ+qFR3}:mM<3ڊf*xjW;DGK6~,K\vɕ&sy1&\hjU@-^kjE4[6y]͹g2j$}_5#OhekV:J%9UZ" մlRy--[3[f,o[~-[ѯU3rnkEEM݊x.E](F7uW_og\C%bQ-t%yƀKF~:kg1=5"154b@b;LPjZôןޙAp?VlRoop"*i8ٮQb Jə)J^F)W#3#";i`>d3ٙfp,"cmdkə,_Ւe[-)Rۖ6vjvU^VIRTpVCWJz5_zŭEd3VSIDɋm^ J5! h2ƴ WDn 0PTS;%l6,i (<JwZ%%91(]f¨5h4` ɠ7ke}':" ZYL9uX}{ObFj/h3+[x1kń} fVof_?rD݌Ƿs0g[7סa"Y;y5ѪUeGʎV-hVr>]O=]wkYYF5&8W"tWp-L>Զ2{k+&-Syˡr@6G8fG4C-9djVW,ҫRWD >Z.>-(>&3ۦ?-onf۫&#ūOΊtǏU9V]<`PP空>|C݆aնR~;e,/x)a=K {1V!טbmj=tkKK<[[_Y0f,{ea=̲YԌ,3ˆueRyrV9 (NWZQI >e?AހǀZzS]&ں^6Pۓfr;‰'ol.)B9uA8dp8 كU3c szR%|'T(]]LgI/ȡD4?nn9Bt@Ŷ>oz|3mgLX[B7y&G|ԣ>[wJYIAjeAJ&/ 4nF>Fy*#<dRkP 0pr2GȐe!,jYhm(8T'H'^]VK]u$n](-no>jCf?ِbi-u"'F1}B"sCPJ;طFxKicR޹2R+iW/7cs >1|=+*{4zaͶҲLڗthn:u!Wu|"RChkQ`u_vS-{}'> DߣvCEA9Od9R9r Nb V@fSA$7d|5Jv6ZKaF,SL> 4NH3Ns/yV$ft}qڤ3rSr|h#ޔ("`\>Fð>51{[4<|Ee}/gOHeR#-EL ):NSBJTbE#^nظW bx 2HϮ֐N! xpȂ4)" A`8ttS&ڋUa1z Zߴ4j:?MȒgEThVVc0 P"Z5< f*I!J҂Jb<T&htt%&y$ɠVcO7jziU=|!'&ISRKԬ*M9.I.iNF8ԕ M:E&/Gq/M}r`h!塅:Ju Cۛc+ǏG룥;E.M3p<ٵdO5oIYí83JdQgt'nPu {jVsܝƝ07l^>le]s(^WCu噟,bWUETa{=V­ξVx(I}Q)`8֞]KDtu|jvSA`'2ʊ*e9BDtn?%g{3İO٩}<]2#[-uv29wPR{ 5KP% %.9em>! ySb ih 2 JW12:@ٱ9[n\'`v)yQ(ƊP:J$S{eriCk[6ŐDy)$dR evJY¦dKNzRXSyщDdc$^a̐D$OPEԛ] sf(bif5º n0V5"IR) S0 lbzoqؿ42JٻpgRnI>:! 
Mamni @#He4`NJNpLIC̑qnf:HaIAPaP[ 5 N.}Lߜ OA5N^'vqkP$c"r %9:.Ӝ ~̏4h"Zx 'I_l#Ł{d~MJ?y,%4A٩.B0dh2%x~Fj?-7/rtOtP&itV&ohَ݆?Jiebpkeݕc\ZwԯOx:99_/2A,C67ф<WGґŃ]_>qFG:5ʓqiy8vZg~CJNJ!%L}~qד4ȃ4\}8qk 6./N߾wO_~8})ޜ58Xꋷ`$AgL-O-fjS7 y^Gn(~3;  !~NbAHsu$nL*z Mb~>; b3DMQ%; 9*%3YvN@e2±3@RaulBTC+!5|ȥ6 FD5Π)8#ThO9(Ug<# Lhf9x{]K+<%Ʌ揃_T̀-ļwErT2)@DLh &:jq\|HL7*<̭%F(2dsp#JfܪhER3<.u[/Bv%Ftt7NcdLZP,r ZG|B:6:#g7w./Fxs8_4z2<+r$(yKId@,PoM^RHRMN;m: 1*2Eҷ %YG_Dj1L>PUc cI^6X9 rk>+d:=%;աxH11M !$E mZS\N2v@2vf9! a[am-A}8n;n_ஃ( ga^**{S$軿~_<8:8(ι_PQ uT"D%YDLZϼK!F|T@IɧU.`K d"`!=\PɌhsNx6%VeRI#{f11_GΉӖ1[W>y!I8}y弱fTҖ%1tJ ibR ^)YL'4\(ad:( ƀ jgl0d=מ͡@:])CM lq,g=_ēӺ0gSt#&sn)KکV]-XGH #"yny9hE;:ҪX@U6Y0N'E2]ڜ7 5#1%}&%յf쌜-*8cG]شkN^>zaEIu[.q7G>'78MvCGG 9hf?{WF ]Ό A]=/F#3"V[&5ۃYVPSCh%CN*rJ3El3="{ҠNj95ջ[X/Np)ٌKvՋ^A/z9 jNRʆFVѱ6[S S>inp*lÌܺFn{_?4>q)Cd?zVُaׇ,@:п74/+./>U6%'ӿ]~O#ػfepqD/?0j/I\S9,=e2JYWrÁ,A_;z5/AeRaۯ{3/]\`Q_3/0ϣɋY7QaWYD4/_G&gKRO&ǵj;o Bxr6yjuh(d }`T9XdyWff19;C*r򻝋XyZ;’"BNC֤b, 4;RȨ1)*f7DQr!"vO`` |Yb驃Y3r븺-<è|P@d9"u)9a ^I Q.H -E[BO?u `*0FGm<@t!oQH^0V3'4IQ!U^b`V)H{hlhB* h Aq4QO|#CUfo+BF0PdupH!f[q>ER2emm)AED:"#/;)l" o,|dm)gBVBIl1ZH T@ ST^JIֳ2}Tft4n F%][~Cz Q Cny``>d)/3>vƼ=/ 2R&k]:V{ZAF{Y!S3VܑgvЃQ|g{hLKPۺLg)dXyoTY^ (f^U,CqY)F彔(t"DBFYTNSd'E&cjcuwVlv_*Ty7qj}\:ny,l8YaZs*htDH n˃d\Oct~:GV="nɋY:a2ej~{R$ҩD,>1%ObR:~1zg+Qh2>{ _n݁vy|UiRˣ:w:a4}|zʿ"%{Fo4| ^%es1v*K1 ]5r4u[?ۛ_Ǔn뽬wuoWhZw[?ʝ^wcُۤUŘq;֐Yw7)^F,ghV-,&=dB^tYϿ=}z Μ?`.D6".dA6Vt)̏%Z6B~J4wh5Pb Q;h0Q)NrΩIyLFZE%B k2dP֡E&Pb`B6P0y:`@dPԊ-"3|HScvt?p+a4ypcG?~dsYvkM3 fOW3yY:J ;CZv:%'TgS2$]Id"Izc:Ćσޢ6O{g|Fz`'4gӬR7T14-mJtV)N0^QQ'CNtFPy!m C8qO0+"b\"*im묥κ)USkBsj&R"!r"R hB8ˊ"R[1N8 HI_1:)D'g̡VBl$JL̙"˻26k(Aξ3@^֌’"BNC֤b, 4;RȨ1)*`Wr\=oeb|\jxb}v葜#c0P9a ^I Q.H -E[BO?1~P'1)g`X. 2 .FA-d5ZYSp+F'{ֱ®uS\UחQ9r݇ )w L+k̇l)sU3eahόl~U !;liaQ8ez^ q,tJ`U "TMȤJsbQ (cO^Ia^hz^xT^8Q뛌[u5fDnY$:eR;qJ8M$@ٳRhʞ pgmr+!(fJmc"^u;K g7vb .F'µ9v`Ch"syKCRAX΂j2HہS$p#ūҢKfIB*pIP@ !Ԣ= yMK#(`]G<*QsņJSf'.W>刴0G=G9l:O]$rNSQSq-gZBJZFˆf.U85< X|eD£5i[E3ܖ9bw#.C8lŨd_ EŞ/nk&A9&FhZ#W[P52{>cœ9)E~,l֒9~r?2S??;zGڍ wGFŽg2/~ځWI_ :ʊ*u@D墶 @8e/}jx⫛<#q)R]ע3بtL i-8"#$QcbT֩QskM'6@4q3&H|0F8piWڃpvcutJeX͘%#9zb5T%cKi5U˃qVT2>){tL ŎPLL.q WZ84EC$?@I"E^HUS9L7s*%s*`CiR .?nF?Z vYp='u_p`QFGυV()5; D)Q P)1ZZ!ZXj3yX(WQuc,BQ.C]Ĕ8\c,3P&XէqkއϮ[DږY_FFwWH!m Ȼ,eYY˽@ڡXLZy81#Kykf_/cx$dHkHeěN2 {LP4b{c* ǿjVWR(b JD2A$jջjE^g}1Ǫ_6+vJgc%)sZʠ 5Qqnq }rƀ}v7^O/O^.Q' [q679H~|',r3OgE<+ZI%? v.4D)SwĹ@vw=P9|&\vDvhISD3g L뻟aNv(cƃ,>f~ {9B>|p(>\Ry憿۩~o_[O0Vqhh}E=mQZjG'ZpjI# F"o bKw/ ̫ KKEf{Fp]޴hk rdR3K]V-Gq]3aVUi\<كvlgӜ93^}1n&,  JP\Xdp*T^kfE4 C0}f›& xWôD]޹' rdKLR[RQiGF8gHJ9jR1ťaQ .]/W ʧnǝ{ cn23l٥[>wyH[*xf{5ؾ-?=ssOPIJ\CWbS6; +;Ζ|<fMgl#۹o(E_ۨxIn/6wjJlt9ܫ;^Н s1(q=" :"䂪Mk<l!>\I qΘ%aM['>^$?/U.]1PMj%sI.Nޮu`ectyIVl>6~: ,xW̒x-w$/zs.nQ&fcJqW8[wX')_3 LGH$ܥR$,j# k 9?J7 6[3C;R.<)} \P~4}| "t~נ^LX|b@_9Sڵj7S{ z[@Ȃ He"eT.{m%ӞIRpd([ނ\y vB-I$y *P/ !r C唷,:-soPqwR *!'-/%Yf"[; ic\kp DpJH RJ Ck×OŞ)?_͢*㙢2K6b٬jg,%kק8ÿ>$I-UwVABDF!PR0ҶWƓؽfH0vI\`QC@&L"j(9Lyd"+0!.1r 1(\6 ,h5T"@"&0-6UK2P%t?ݩQfY`6j*{D$}ʼp,hf:p+,1<KXB zƹJ+ !,IWeKPvh(Xj#V[jI{SD 𒣖'Sg}{RKŽjr]n)zEY>f_yΏ~G߹By& }fLM(kENTp˜J@u>R=Ex%H\AO^;:OBg's%NFSŃG5֙p륋UxǗxDuYk]OpUX r&8:n:IOqק sb [ߟ[WMt&;:ٮ^/٬;ĖU}3=/NY#oovwm]hSڕ9|GS097W7ygMw/e^yJJTĠHg2&)ѤE|w9I sDP$ 7&TVr`:Wrq{q'L$S)s ˳H"X%R 2$&QsUAɲUq.ZIB\$2A:#:j>1eJ*+]pvȡJ3ڂ\'UBr(GX/ /~ࣇ19Q=SE<5r(LqF9΁zx;-8P/u~ 4ywN~I89]60~ӏHq- yEqpwo.O \ NXG5^0[ACVoƣI| ߽~/fl/^3-H˛|umTfiE쏽x`ђ} Z¼]I[αoGkn3OSpm^H5eKP9U!rF:P3ϙ!M;,YֳTX^8gYlVߴRۖ-E~h'/R(~ kT7[RE!K- ;*o;P@+zQ+z=zx൭4X/S{ҫyK͛RJ15AV' -U)Hä!;)&ep[JfN)Rvث0ϢpAiyfFyrk՜rsY1 Jh$P%x\^;*'ӊ%S}Q&/տh׹03[-*~!e<4!RR])KF6q*Q=1Cd!iS%H9<1RHp0|.n\D"Fw P%p:Ns:?!Tr?G*L̥ ڲLow +-!\oZeG{weMhTָɯ<3.<~ kq&܁`ӷfK9U2)y? 
smvb {1dG+*fAX)5`k?EC6gJ&WV` $"%IwSg2^9x:oM~+7P& -,IH/aG BQt }tevf*6R䯦tUف\R))h,ŷUomNz11!ӉO 'N@|T3'jTU=ٵ5yr1ŸՃY0 )0레IT_쌓(+Fo~7]a(^1Fbu$t6 iFafQ-ŨOiBWcN%:*AGLmԶb(ZÅ040ܥ/ʐb>٨>}N^ҩx_w6(Ǚ:~?I?߾>/~>z&ã?Fh7; ͺƛ M[eh嬛}\pSn&pr1;pS[..ҥP04'?Jf^o@"g~ d~ZjkT3ʝb)iJo Wfz־Xj]%vT>xF0X(߶0]󌨀<1+ӕc-N.5Lg{'`I%2]>:a|q! N\s4 h&AэZ=wkB{)BX47ɴ@'~#IK JXJ~+/f,Z9w 6pu1& WP0vY4% Z{1[,A+2V;Zi- L"b_5NBvō.b6d8ʾ"C$ 6۰wz{X-K}a#}%__)gSQjQkKEiR=J T2qB27CXPٰ'(P?ȉR4g Y{jj&3Z1wu[T.1M(谷."¡Y`FWKEۺ37<ڦz=oM딖Ni8<߱Yac4jZ3@Ϫ[u$my̷Bd>epzhW_wG㺩 ;<4KuU\z{y zmnǚ|@qc12gH(FRgZ\o]as9We%vj ?e1N0xfYYԈE#sA!PjQYEsŁJ(mEMc8ˏd0I7V;]'w P By fJ a%ÚќI)9®.WC1Ј#D-LsFj-IAHZF.JȒN"K msx%<s ;Y-i6tw@4*-Ӽ`>0x ,b3qsiXDKXYnPJLxa?Ip}5.>do< mT1*(DO! Z7`4gbeV"-A6=&!YhʪdV咁&f|3`ŝq[ˌ«{m搨/*¦Yy6 #5h&5.\qXtǛMl v0zsc^4\ SaS>+/9[V~_]ItZO4֦)jU,hcvkp.3tny$XLjAmJ~~[QN6y ah2h{v&h],ed3&(3>̢ vbW \vvzW@+vv}"Ȯ8~vsw]AF*AK=zI]e`&[P4&˜i$sέù#1dhZ$4*nvT!I1B{p;|9 gbNks5&dAO7Jc<ɽKO/|TM}~64E%^ g>'/@8U(rr87(W^QTHn 2v6"%_xx=EG᠀呤zoT)-TԎ@:f t$putȓZV)06 xI8bS 4%5&P Vrw윦"n#wR3ELARnXްxJ B]Aj~惈s|̙4W̅PPqSgz;q=Fۜ~ۏK{pJS-E֊֖+*`= I5!/@E!7`Y 술d 8 %%c88/L:V҅j]``H˙eF3GCwЉn֏o _ oԙd^ 3=ϼr8A&7vl->=J^.( iG\I V$**V{[ki403 NyOQ&}Ǹ lM HNG,"r)IVGIʍD}$x) AVLZ"$L4QAJhב+TT!H%YgHg]}{d6LQSUH H7\~ IU($㹧T\E,5$ IY g=skOD) ;#)'RZE$f sNѠLuP0T+6JM :A)ı`+n'ٻ޶vWЇJH2C[(ڧCQdq.@{[$El9YrӢYkq SL2;_׾Wv y(BqΊ5ZR(dAFH?oˋ?¯ў'Ej1f}6(FD2+Z>PbhgpAKuUTՓQvr5/1&\N zB@ vgʻVcX*9<Juo:V=|8?3%_6&)F#rqkL6Bi ôP=r=[89lIH&VϠH s Y&+?p!pmr+%ñp/*-*Fe&ݰJ7[mei Au4ƒm}!t˖#/|5[?C?nZ&=k$.`3C5\iZFkņU4g/Q&8\_ ۩V} KUه8-v:=Xvq$Vd'1*&UIzd1bp",kA[ȱO]W-QQ( CEg(^l )&d'Ql|,NuwJ7qåSn5 ZZD""NqcSh|`O)eD9&訔W erڐc4/י5F.ASu^{4ٙ$"JY-ݺ9%PQTͲkh8AG^ CGM,ge VL &4E߻wP i(ZfECjlsM%L(GPxe!Ԫt|yo';\7;غ^v=.Co) 6s jɈtD\Dk+p@RY *B14zG<~}ZN֠xiS9iZcOe~r%ڞI.P՛?-yigGD^ͻHHd!~a(Zt4ֿ˅Ƚ@ơgw}Mٮ ݆n@RV>a A= +5gW&{+jM1I ~ vz=%FXlqur+q󯩃I1Mjx@ԐR,bţ3!M|)XZNѵN9:Yѳ63ךZqV>kO#]SNΡUI}/5bZ ] IX\YZzU'# -D#ƫ]qD s.|S Y'Y<#Cs{1{B5Zy&b>jAwtGsg{>w^w: QRBUCVdV6@* c?>Jy^eIYKTZjtKrK UFp1`VˈXH |4LDSb+'R08;ۺL1矿z%ķuh'5yNoƬ熸lk1`CQ0Z(*%\|ur:kl׹ i[SH\!QԱ:3&k&"Ad5޹ٮO⬟@/ϯXkV~i- O7 ~YgBP?KgxNh9MC uu:9&ءサhvio=t׆FOOx{9|gݿ):&|ȫ97lvg'cd٫*LCP AЪؤdw:ٝNvd" kJ]FtFĬa fWs]\`66VU6n]ZeNB6V wM^lB,OrZbs˙c3KYw‡c/16Pwy~=0o:\ I 1yMb9^Y?`ke}6&zG!G!G!Ǒ!㝶 qZlȂ"Tk (|b,}Ij]tNŸZzylI/\/Փ?Z;[|B͠duf&Y{EdNȅJ% d&2TZ)9J=Vp}5Ea hDGI[,ŽcEvgߢEBu?27)X̯GN?úU/ܷ>Qo|]w_QXꫩ9uLל)/Xs&;z?OWK`ZcħCPLC $w6;XӇzx_*F#U:y9lCE^syB v mjۈWkis=t3A)Ģ 2+dlF W!;NciKʗ󑯠~v qy%w4GJb0jHAM/JMcNn(j=+WyUϠ?Y;d+69`jkIRJPB,@٩AM NP1;KON/6}bjZ;NVJvMv*w.yd` r$k7j€lΓE8Ehkqc2 ax>U!0<9dYgL" }^,JPAIC"g'Ud|UYkd'QևbDgw=CCۄ.P۳;K\ל]Nxo raU'7Vj4ځ=%q!Y3*զ@-C bռ0cm=g=|I'+"Ȣ\-#8_-=A7 Mh5|Yh5l2%C73_IA@o D˔Oӟt?N6],귓o\ ,'<țy _dv.a5z #eZNz=_;kSz׿QY,0ؐ0VyUF}ݞ6DǢ oYq{1ec6~їo[ uR57{Q5xX:33xJ1ʨD[o/}_zYm.7]_HUTdX)Ϋ7i9 ^#f!"F`KGi} N `c&jEZ_/y@;ߡd2wSՌm\,UVү7roBjLQ}6Pgd5%˜T(Am\B*c L($+װ%}D_[|lNGfs³]wfSXUX8oW?{Fn/[n]f~i`jk-K^KNh$KF%; 8h <|sqQAuEUgni+JWOUmxkO5sw6dk3bӣؤk1<ŀ £UǜI]ޞT%ݗ툼{4Uٳ#~꺈PxW;{쭖<Uf0jBPFS)E崉S饭d!t}D:Q 2F,Cǹ2D&ʮA!:蝉ep|}<.\^dI£vm5/goW=tk*u6w=3?-. BcJM,*B qW .FM,1nS$8Z̓?P+ْ7HgJFɃL՗ ъR;"ZQrw"pOݏ…{q8pYY S!VU<̘@%BLlזS*D6ڄ }Y/u4muI-`V@E D伂$R",8.0໼Ƣ:5sAځ8OT2[9_9PT $T'\I)'vf=՟y8O? 
LCM--/gh~ CCWCxm]>u3v->>ݯ7z?~!q7 o~WV _Aq_M =tQKR1@3_Zh)_ČzP Ax!Vf:w+q$-)9gq6QH@b8QB i,$O6t^Uodqi#2:r/8Cf֥x !W UZT3 V˥s.u۪5Qz[5T.b׫_"r҃K+pU_:!hֹ쥄=}պ)pm$x&7׃ԁ_ JP\XdP!ʂkͬ-Vynm_Fݓ]u:莴˴A^CaH\kJ=!3&H|2BTdvO.HՆZp R_/}7vj )pޗ8M]> F'o$z?˰dWe5o ט/]lE@RGiTT㯨TyXWSor4}R)ލ\go#w5q[}v Ɓyk@ AǻzΑwW^5?5Peyw^r<{n['$G*M Z^9l%)a =©p!z1'cRT$xW<:'Z$8O\2Ox]p$& ěqaYNrOמKð1*^0I!ҁJ91ɈC(9jXp(k֪`d %)T7oaNi4hK*&/7V9]Q[ð3Ō&8`XyL"U` J֦*m#\HzI?WV*J/EW漴x%["-~BaJv}7Rtލ c*V" sRSIɎ<1N&iq'Y73Nic׌84 a6আ=ΙpFP1H9EP88&J@mrp0O*cK\gl;d5prsWpO 'ٛl 0͐>[t,ӴT]'ic${gƏ1޾)7rJ7z ZѦ吳tյ@hsK=>A1O:* Vմuʶ: emnMwQoBf4Z͚uŚ v!n`4-m}5 zMҜ=Y1kBl[{j|}=k܄Zh[Jҽ{~K9Xp"(R lԛJ*+M90̓ ??7(Ptύ5>qe"J%B!9Ȕgɇ Id*(KB' % X: q#^^|//qYOo1ALG ?M`?M\$ia X I"`?!`_=Puxb=ڽa|5z9 ##`I%T*+ __v|ȱflO|y'YJR 2IQ$ɠI0/8Iqk#!蔠4R gx@ ŵ%SBe80(3Y`jDzMt̗;gg}wGĖ̖/)!7}?xpX/$|6/Fsa|0"N$øFd<#$`։%+XΗN樫ߠ7t}4˄i k5-sڮo)$νyZoS~i,:aB(em !$ih YI/JEUio[d7m_Ʈ;kȱe۬W;~u,7&nYl8iJF &%0DŽ~"mr1^gաuZ|+hdjRCv@]h 9QFCט=F?yo;4$IQ V9ZOBDF!,>NPUjqV=VgդT+p.7:3 0$cLLyd"r/ߞ㪚/ޮo!?v6.j -*{; {./w : #KX>*dP\9%|`oHN;kd4Jc,d> .i42'C$M<ټW7skIv^Q偘1RRԢ-K-RԢ-R#+ *KqRܱw,Kqod&d {5Y P\Ekjeسd)d5PnRklkZjkZjk1ck!7kZ}J5R͵Ts-HkHKsBKǭܰ;W4xa\U@fJ8IWh89ڮO` ^Ah4sB  $RtQC5][3ND ?Kq}(Q݌1׬@eWH/w3Ar R,wʘ~t ` !X#U̸@CZ~g(^N1Dg(#9&H(GL0'# (Er+`9m/_1"yEynuo+K \7/xG9DABU\Hxq%a GNhFn&VS4ϛB&[%$Ʃ $ilb1 kssL&J+Ni<]U0+\-(k7@t4@'<؄0JDCOBous?ara8C| hy0p<V\ 94wVk~r/z^|Om:ec*JWBOR0(慲Dywݛ~l:gAj}>d:`mE-Z-gBo?Nc)6z5ʂR{YK 5 g=z{Hy͎ٹcksy6yV;i4J,xI&[Tf|7ī|]-RӦ^4(5sQ c7͆ }.ߓ YEʪٻ6$WY.#KU09'vqpg]6/$lLl&ap< 4=eWG\C/$[02*_8yoJ߈em6|AM?r ՜([=!$`(!S`&ˆ>]u>Ida P<.f:>۴$ĔKDoWQe,:$QuшH QzK}=ij;inu4ՍN&7XH!}:[T{R-}=%ivI_os%Mzd?E؇೷ FBlCW~=*l{=`F띹xjScblxRiVO#jW\2g/M\:z6F/V(EZdGl0%u:;ȹ禾 mEqz3w%i(ALʸ:)\<:{N};uWan31:O ƴʝE"1ؖZT 'D>TOwmS,_u90óyk^--X>J$Fx{狯>hH XbZ{N",6  Ǒq_Uշ>/xUe5fHV_0*%'7gwBv8??,tbQ߈(bnj*hXJaĔ: M!+qw**Ek _SR`uuA!xs ?\Lxqm*/up|l`/5_LɘߐQa&?M_y\à ||$Mi ^Nkڲ>.tPdѪYjwj[|G|o~**h?g~SoeqlqzRo;Eѫo}ׯ_}WڄW٫ϿS.P:>D@ { VC:|=.*׌{]|\rN˅ڗzFL!WkG[hO&-b~?6O#U}*\PUhT!l  =;^84޿\ŵ@IomӐ"FrIZYkM4EVQb'\8E]L-*VE{.FɆFmJ8–b`E(h=u˨NHѲ!(HC>7xR_t5u AF-Q3MWvvnpvOƺˮWݱ(paAL]yY/(o@ɿ*L~Wu# AX}g/?#.Kkw/锢*6o0ϵl1Kݰ0_z{KE,LՄ bEY+Xivܷ_Q5R֛JL䒏 QҪs uJhACѝ}n뮢ʷLJrBx::J-' bmc(A+UWh..$e$%.`bpR-\Ƣ5VWM`?+ՄtPUBg oPTJC.:TZ*.fN7v2`)hRE>҇ʍj[K;k!$bľʳ9gT )b,@QSmmbo Ed. &`.ҁ69^N$Jd,ǿ5bő ?>Y%O G*S1CQF.~oFǥvuaTxS2 .L2Qᣜr.2Ow̖=9Xd|LɁhF9^̜F(n&_X_LŴ|B˹bGݰvu0aqhqs̏c:;YT { LdQflQEыRc{ɹX Rߞ6mhҥoOoWo "t_jVy]H6 j[5ښ--`e:X\Lm!SIZ䤓u6C֜ D]91; h̔AL%,{ۺRQUKԯS)1;F=c[U6[3}+a2]e#& ȡ bh0&|d duTJN$XQDq Ƣ H{JgAF݂ʯ[5H-~Ҙz{-`n|iE%f'#~MlQz뻽, x;wW]݋m! 
FQ֧HM!2d|VӪaC1u.JH VQ=\B l(1Z!y&ŭ!UQR5c7rtӅ8cC]%OƝ.|OϚgXO!~8~f=i\⻕Tm2&PO [+"Ί ƶVzn^"lƚ6-u@DUb;b5*>tȹ[c] 1Ekw-ڛvi/Ń%&WS, )8ʉr&8ёS>|w 5H,e/Xg؄qV@CL"{HP>9w1_Uoq[4b7T#Q4N#޺u.gq9: :)UbJJFݚymɃMdXˡ xbD,;knܭW6:qɦzv֋fwz~"JJhu*^oVTmk;zqk~zqǶC>|vGESFޫ@/,`#H>-w`Oֶ`GpQ_uNiF\lyQmo/&aeBL|Gdz\<מ%z[~FYwUZe7i;5_MN߾tn@- jo> >A̞֜0+EmҨS v  [8veBnOmã=xԍh"ʄČ&;XdR,7& $+ňpEPj5|4r ]_cx Zg y%ѥwv9@މ"V]}e=i4ͣ\hF񃶉S,ޅcR.% Y(^sd&) gVWD-$\m`Ŷ',g wlY,%V0W +9G.x1K՘D}ׯzcJ ܊gl pζT焞?u#ru7U[UNoKE%@I&!]rs"6x Z5GeOv'=EυzSEU"$ k`q0R"z))QP|\nV9=I8#yȾƠ*t+8K\EIS)$*TS]V5IÁ.U>fZ+ BbbWBcbĊr"3#VTv/#"V5{wc} K _wd0fۏEqY+PgX 礏>e(s,)?zeѳKo-]2ۦ"3I=:zr7%y1S@j:Ή̒ ŸdQ}efM2؃[f(}c^{PT/|ՅKh;yęL\䒝#VXwL:Ȏ)dtDrM@qE%٨Vc1-FOgx^ś7F8r@7W'A^/BY8riaRr~8?Gѫrhfs0Gx^nC{|}|݋9ia6N{vzoZ^jv' yŽx iW>IzNY=m>b~|-M-oi0n9']xRcbJyĆ,7OO|ʩ;:OZ產TܨSENzz=VۛlKD7WgI6BOlt^^"4Nճ.9WUu=Ӽ`h((b~.Rz2ҞT ڨ̴;VrXY+&6+T hyƳ#]]AE'DW8Еj*tEVNW]MEei2t%pMmbPPF>ҕjBtl3 Ar ] NW@ԑAe5vAkӕtHWHW!(S+3%pYOfJ%t%(:U 씦vy2(pdA5dPP]]' Jgk祫3 ق?ށHWzWqBt% ]nP4;] JG:@2n]ӂ?|Fϵ)ϋ"@kLnK?G6\t"-͝ZNdZ//j_;dD tKn*A>Tt l Ψ5q*t%hY;] JgtutE186+j:2tD>Dbhp~=NS{8U\XTtrТp*:N5{>4)U}<@_Y JRW #ONg;x7XY"]4 ѕyFM15t%(Q]"]( kǣ+D<n2Š{ҩЕ۲݉ϿN>J~Y'NhY=ZPYznrGzlkY ѕ~2t%p{͝:myCWkDW'DW3Z{u%(=;]p:rlyͷ_N[ޯl\.Myp+^-'g OWh˵C-/UY?ǒn E^O$k]śVKgg%ݤ|>4\]9@7vJg뛋+8BltWY4m}5%Že*7ns&kqpl$_q_oixoU/^y5"zMܳekfop۞Z|~돯㞬Ćʏ-F|@y|a8O!\Mt Uu7 6FK:,Zx{I5K3uZysd6CIqiJM#be-|!1؍q ٹ\L5CёBE{))w <LV_n@uphQJ=@ܽrVd,Tx2; 4ޅKCG#ئ{y6;. oTh]C>:x!?VW*#IM\>n >MaY /`ɘ3>|jys*ګ;9SSI{f S $]۪Tw$ף m !#5wX7E|If)HHI? mBڈҒ<&Xr!xQU-yKBCpڸ\=D@ Xēvڎ-Rէ lHFu)[_c cf+5A ղZnB Qw_Kͺ#oG0L#mTȗBi")K>#(<ݫ N7h NAk!N4u/uh;g۰l(*Gu%Jr _e]Sf$X<[]:A8Flue+y,q!YGeQ\0-!ձ;udz=s%Ps l[([89*|b4:dZ{J/G UhS%vdZ[El)h'SX|/H1?&;5p\ jਗ਼X&Ef2!A6p.JqL +QIw4 ~+*өPm5 f Xmd*zb슖5bze] R uWh%W&q2F[@zLB€(39{i]UtFtQl<t,b@E&rZ%d^Se( LG2P&>y/t6@"G-A@8>UUBN5~d}LUFƃ d-PBpߩa&B?$7y $di*S&#d6A@F4=R#A[y1j"!%:]%@_>g۹DTeP 0F%8CܖSBEKٖh @k^WHPvyT @Ƚeg;VQ`Н% @GJحlc:LMUQ2P] C VArlU? ʰ*)eR]![%VK|p?ZFHG,D$Q55Ԁʬ$޺3JP)J]+"J^X.TJ]UZ`O<{U]*V*$p>ᏚoU7 M̼s0 89sp $o`uךZ`X)8ÀPp^aQ@H!C pT#.b4L {3l} :QT]xLlE>0*sBDf-lM%dx.3JO>Zs=3Ɇyt1TPAkO`EY-Aْ -DM%@rI zʁVBCa\,%8 fYBgxz`+=B=? 
S0E`;1>dPzŞzM$‚$e C )C'Irp/@`piP[XUWɇ(V"R몈0@m V@ Yޤ!F($ ~%^R0[@$X֘p^=tEj!8m5bW nPLVa2C0-IU&ZCh[ &Ցy_~Y>ǬZh -NY_sgۛ%_`[͇*?."`Ql29̛vB)g1k5:N& j:.ӓ[ ة_5pr9Ѧ45]|^tIQwQ_u7|γ m[dzAø-~Tj ?3s٩dރUGrsuh8ɧV/&u":jVr <x=]Y,i?aadY<4оzxWbTܧ]·mi3E` />"s] V)K|HvuS TO 7w p@W)~-wq|7h·[Ƕ[L MZ8߿q(C gN8$r Fk8} ,rUɲƨ$Hl`+Rn([qv*os %i|*r`|IKi3L$I#IIHF4$$i$I#IIHF4$$i$I#IIHF4$$i$I#IIHF4$$i$I#IIHF4$$i$I#IIHF4_/IZw}L%V-?[GC5ٓ9dH~davZO`M%>DhIrO?uju+<gcW0-WDM zJ#PAtJ[΃,eYmn!ySb 8 aaRі7-rruyn: y _F]?O/wqn}g z9O)6e\*Om]5W6Nd W2Kc@!X >) Dge)g˳XVt+aZTqưB Tgש1!>Bt5Y Ke.:e ۯ^ɬ>Ak.dt1_}v}e='`ϧ/&ù_Rc?Lb}rm KOŨmu-*\.@RM-.CUw\:vviB<ґ _qky bTL*Jɼ;́JevWu'aߛmP@Ii׽_ռ*|ܢ浒a:otwy-tJ1=T<1{FWv4^NU[ɦZ]A+_BYf 9fǸXTvNbH +bG}駋wK߭eھe\elygx;]gݚ0V $~q[[ۮ5o_6ĥnP%B/m+f-bt`ҁY:0Kf,t`ҁY:0Kf,t`ҁY:0Kf,t`ҁY:0Kf,t`ҁY:0Kf,t`ҁY:0Kҡ옲tJ-=,0=,\t fxY:,c#ÊLFX#C͡DdK&oJnkD2u8=P:ܳἋ]^0kSE'%ekU+( /g8-SG!=fMÙ:B u"Z{8 KcGz,#hjݲ̷8H6"|lds$#H6G9͑lds$#H6G9͑lds$#H6G9͑lds$#H6G9͑lds$#H6G9͑lds$#H6=dCC ?x˸Q8ׂ뻀@cΩZ=xưdh}Lx7ÍixPƿFn< ^%zOkwޞbǙ)#]|9BТR$5D)Ǔ690yɟ7s^ƜS=sNϜ7J% `EqQ m.PP:%QeV7t99ߛ8󭵞ψ7Xμ XC8};_<̍s_, TUx=:miE츫, ^9$$MÓxuuW&sk>%8';Z.W樧N;jlb6p &x4IgT,f'?N] RU 8iT5HYT !U] <"Te[8hrzt842*^ `?kt+mo pp0nK۝u#٬_ۇ'gCR{fJrZx=fN`TQࢷ=nI\=< aAܙ "y"#Lnc8ygW=@Z4`) %V)ȬT'aĴ՘V?״XmX44&BT3W(8&Y| Q+n"*iqK<|{oIZ].ëN6@]̶;19;@'?5_TrWGd)9r!b6LH(/Mtuܛ8gV?d2+;Vώv~a}Hr;h\򉐥҆84ø:)I3LTMRH2^J%X 9K3Jh-*|yk\웏F溔b2e.g ])r 9}XMyq8 , 2ax؜-}%F9?"n(=A{QX<̠\]J6`mwo~>6ǓsU7۶ox(ܵB9gLmYF7.R=KeZR-w8 #<%r4Vg=8P?y&ZA tA:t@i7ݲwSˊPS`gDBIl?8NfB*˙\oZ}}I{N̋FF@]4~)73}D\i?AM(d5*u,Y V>RAŃ!5djfEIY jKd=ьrN%ll0ylf )5I e \ Mâ,- AEAJ"4LmHAd8<{Czd.hK㼊M^ظ_ E+ɊĴ7zsS΅׬y21S(&U{.)ev`%]pִT"G&̆q&ſ혧َ=`3>=޸B"&ΧB%E.Ap2"z*[0}W; E?G鑅/i=c4=8W,E彵\EC$5x))L8ʅRt^)=;:v#OS DV1*'.)dqTUr>dRXܭg ~ vQzFmSr1?Fðo(m _}0N&X 9{IT \(ht;.PonWEb.3 Q^gVҩ"<Ă49#L" %b>%E ,{9l::|mPb5)^plEUJd*G w]D"0Ŋr2td3-ϋBNP*4$.I4S|9^|9C7eY/^kRPQ7׻ Z3'Ol9^)I"p#x6Ҋ"i%GO(KS|bVp6W8wLMbǜчPCB}S7B dORFNULyp3RT9ӂ(4.F\% fi)*T?NJ"[jFxw0"e6Ҷmn pɬ%]6wNۄ u uv 7b>Eȡzx=(H[k-LY~@B̓ ˈ,fZlr9/_Kq023tPvۭF:]{B{=B{o.@GЌR`ݛ-źB_n;㛦9AEetW(|nfc~n+C.4Q`#QC/=X;7,p-](ޛ@]0km#G/m! ng0b?*&Bl9gx_-Ec%8rwYͮǯŪ'wL,8f| ';V#9na}­U~~ꚣTwKߌ'vI?.igl_f!Uzdݬb}wI'v] Yvll%{={ڞ2;R!x06z!$*%Iɰ#@JKmqUR ),mJ)kupNF"@mOLF6)2.JZY$[tA Mc`ڌ57-`RоR2~vR\:?:=|sjn_Ox-W˼YrY;zV'#4X {'U[xtlZ7\3ۋ0+gI9d|Do t,Yj]nq_*=C1㾣 ͪog :B9,U&*JH䤰&9ƂAh6Z־)Z ٱð|wLZ+)%RP1#@ "HB@IH[  M> € N"Ϳ +$b Ȱ>$On)pGkF՟:3_friXcϭ1z_'t01%pr_y>ooF]~ ۉl>F'Sƽ]yգ5!fg^'^xu ߩ~IQc<)wg3^ &ǓNS{HF# jpˬEir':ՊO=s|4> ݣ.rhԡghX:vZHH}r|Ϧ4:.4;͛pM-g7OfGy=X?o^_́T:x%[? RjHpo~AރC_0i54Z{оS'| dGޙH뷣x9\3kG]h+z#\|f1?OGy~\J;K*īub#_x) ±Dp$iw#(})*( ipD`@ш)l. Y:!Gz6 a˓'Ƞd)ԒHABEDvw5R}ݚ}t*ɨOld$\$6jf֧{wεi;]=LEpxDcYW DVĐ]稔J]"OMr{y6I$6UIEybq;#v%&dRy^7VZRj[.0)pکdsh څd(EA"ә =DB%f} 8oi{AWQc襦(k$2o{Jc0B^ AEMAjv$e$%y 6XYcm,%3UGF"c-h ݂P<IDp>eK5&T5|5a՗R +1c. h_ʍj[xvebtA!dH)A6@ZFK`0HcE NƶHNhK}Y{m-v \p ÃY2oG~csWOy_ /b{~#Y|sA% ïXg't/r_q~{Q J=,JNY.zl<؏2JYr+,#/G˞=EyX}L>Pkb(&K󆯝NMrZHs a.١/g~(.XѬ"d2)q/e]ZUxl<->{pXBR })&:B+VI &VR5c3rHhUf.7Um N>xeӳj+N%W/Lϧp:< xxxp<=5vIV:&`)0v1#騈T(dH2i,V3Nh1M5Su"*SRZU5T@Wط>e`K"5v3rVktlFj};U1nFljDX#ʝFi;^LY.%/}f/fAǜV^F! 
[ mfn]A` :UI,zސ%-|I+ttdbX#6#gFU'֋EӪmu6MnN/}(U6N\ h@Vcm`5[3CŭYiۢfT؊N\R/ÝVfi}:wڏ ͳ4ohlalt;yS 28p}Rl[ۓ-O*CHń F2Nyh#R9+V6CV&I;]rz猑FƖd=C+ @"(,]LNKF26#gD'֡[s(Emxl_e^>6wルv-ԛʁ%jhW.쬏R&/C >!0H R%&b3˻6܎Q8oSְO-Rc9Jй$PT͢ S6(k4"].DV6ɃP-uͲ SM+\F≜}Y3r֔-<|z[ @>s4mg|h96',!k$ vARJ$w"R>}/?WkǮM F,N xA3=dfCE#zkA|zv~Wrmd^d`Vixe%lh $uO%eP P;DqlUf<::f `d%̚2H6JYʧZ&pL來]Di-2h5{Ȧ buvaJA(1Zhi-`T"SYGY[ Zwxc[l*w3gӓYd/99AH@BFYt"g)£lr͖ cjcuV䬆vE*ټMԸZ}߯r۩w_3S?fƒU8#H?:b31=>8ͨ؝Ā&;`5@IPԒ7IJ&%H+ ^JJN o5}/ ٤7^svx>Az;/n$`=#z6ÁMr(S֐1֋ڹØ$ L"!-!XF] In -s5C|Q n@i2:6"FW͓>ЫO>L?OR#św dzXRt4=+>rEa(6l|acEZLjzr.elҗbe3VYvawPq*ZdTfZdXJJȨTZk[dXa]=#2,}7$>N!zItvRZfկa7ܷ! ?y _Z:8",9l{n0k'h^0f|((PCJ u2b\cSӺgeuN?TW ]1.&i]WDiκbt,z ^ځuauՏ4EEdACW.ԢU?tŸhuŔ`F+INTP@7>D'[<ݯәOoWk5m6G}^|g~o+ Jei}fUq|3^}UB)wO.JesE1UT{(j?-˜J]&t$` s7')Z&!]im#KWNEWDk!Idt&j27IHW&]֨uŔAxte o<jHFWB+ZƮ+k,ܻx 銀qAk3YW#ZJ&+vL3.&3$Z/ 2%d]QW'4θΥ+jyI/|@U#g¢ۡẁ@~>2]a]aթE 銀#lp.]1O鬫J*%f r ]^bgu|ҽKz[iԟ;^6^֛ ^)% ]׵Ń22s1'ɘ×hI~&m֕nM|=}7_Ftf4s =!^V;߈R۩/f*ESWRkֹdPt[ǜRNR> @UTAN;,LR7Oi$GVc2#9Ƶ:b#9tEy$7rTh]*"Z>\WLYDu5]i@Xb:듙''Z5 ](Ⱥ )|B"`׫TtE͹#QYW#ԕE٣Ϊ+\"V&]16)ɺWQc+ǛS3θ6O(1cݨ\FNqLEWLk!v]1elIe]=@WȢ3!F0?.|]Zl;~*vCW>ԢaNIWlNFWe*bZ'bS1J tB$+v"]1.TtE.v]1κNlLoɹ'7s\tۥE`ɼ$Z dwuF%o`\R7mH L)m7ߠ 銀A3+ø&ab2L]ue$%+$+5:]1Pڽ(]l'd]=Gއ:H#e[~ICS*ewO_fyM{ڽvs=~Z-qќ~{g*bnj}$u;sc}ǿ{Op;by";5ʗ`A߮o߶_v=wvOCˋ~b~Yl~HT'@i( XrzFiܾ֢v-g*\mz|[/&MA'ܮ /[K+`\{RM$N^>3FY'|Um3׳ӝ9:lCt6;M\-m.>>kfMb}=yv2mgb1t՘F?Ɣjc9I+JDMɔg]PW()X'4Ƹ&njL0v]1Ϻ<-&+NqU2L;(;u6*q]q\1(3xW?`=`'1cOZ=L'ed2x]>uuJуBB"`z'TtŴbSF.qt%uOҌp*Υy+xZ>VFvG~Sόu+"F]itEvpQ%Xh^WڰF+c}J+Zv{Rܻ6)]3 %3dځN{G F+gyctpHwŴŮ+Ԑu5B]LHWhp鄤3O&ƓhdNoxtim2b\h D?dȶAκzȢpGõ0`h_?J ԢGc+$&+i?Sʺ$NrIGTIq.oN<}/3JS!H%o`\4Ao`J0~BuJRAdtŸҦ+5.v]1%JlU2b\Sge@RtE*B2+U:v]1ϼƨ+bJ"`/u2b\uŔVg]PW[D؛t׻TtRŮ+̺ГRku*bZ~0; Ϻbt%,z9Bb dõ@KQƶ5+uujуJBBb&]]WLTu%AH?ƣsqU2bZ](uY]yRJL@H)1Cn%=K2Uxc#дӐOI氖^%k:\ KL6y\ڋKIv|U 1[Hf$ǸΦ2cZocy|#9 V`J *VuŔVf]PW4vA'dֺuŔ賮F++`M/`mD2btuEFd]QWNZ)銁m:Q"]- luCRbw釋.p]1%uT6vtzWu*"Z-dJO]#^[i>Fu Wہ{WhNv TթEG="`"]dS: (QAu%>ZD}anX4V޻GWQ8ZԑQܜ#DR+EBvw舢~6aSb7ߠ {8aZ)c;2Qt5zҕ%3+C0P[t.j2F 銁LFW;hƮ+>jƈc=Z\D?wŔ6Ǩ+g1 ϧ+~sp TtŴ:z]15YW#Z*%]10ɸ>+r}R1IV\hIeÛUjl|כ*+eb+Ko 4Eo7\NwqߓjnDOGw]n:CUPinj}6M5TOwడU^&ϟSr>o_Y1BO+]7SNMQWa*"`YsRK km_f="n뫛۷nڽ.M8B:n[|{+.q߳\, )W\g:^ B}&M޳T׋z~yK#K[5Zm ТRhQz[Ua+SF7J  C)p,*ɑ&Lr]]M_陘Rc΅+]e9!T-xԦ $x!N~JK]X@U6L'pr]vZ=|ͷT /mvD5|S~_jzZQGM~ߑZ/o^?owK~bλ= ,"VtQM2ם;_K`۷|O֧~U\_.e{pB|S<$>WbC -QP cej}0 QMQRllE]rroÃcjp٥FŧB)(Ci*߀5ec(*#htcRQSu#eQ˳B4rRIЏ@Ji۲s^(]7Cph-«3v/u**m&\U\-%躠c.$T @ TF9 xJr8gxo*MYnB򊚃nj4U|0튢FA:[W.8)w7b>rg ZSݢY\};]MMAwlYk [:*EM-JYҝ1!؄>7bU~Dx*ǝ%_b^uf<_^->'}z|mF[oj#Πo*)DU/R%.ˢVPԪ7#Bl7ׁjJ񝭊̍ ò&eӑK-@ZY 4kUFT*,u|V7ՐS[?sc*b##G=TލvOwey~q};} +JhlE`U,w1uU\_z>ه)5Xr?Cbunਣbfv) }&1|O Jv{Vsﴺ7' 6v*\/7wFy n7lK˛* av҆}(S4t}v{J#Bg߰s1#b ^BR(VO)a*7uiRJ!)g o!) I9e!)MZ6e#N\ҁ.uUM7$ U@mmDBSNuWΗj|@Ebd3Sφ~jн$E#؈_O{Hw̔ۺPu%j )RԝW^I . 4mQ?{WF\S*Gʽ7JnHZӔ%UiPDɳU̠i4ucMYs2IjpX=!E}Қ521 L,BkMj8m rF<W۬6,%VG͑-5 ȴRlP[BhC֓8,g7f8%XUϯr!?Px].`yroOlV#>"I BX-ɐR"@^[ /Z}0At&oirh[;;v ܁'r*ml\dh~g.ocjD[PEtAԾKEWVTdТ )19A\|nFO֎}=sp~N g4g4?T4úi- 4:eu,"tl!E4x$I1Fb9HdMQ^e R'1%@1"FXR :l-n(i6F>#0".'4^ǘ<6zSM]_G~s< ?X\4'>-xIʹUQJ@P_4!:|AyV@{PW6IPB&S柍?1p,31N89{6L jrUt̞8ΕĪ4'{ƪƱ*j_ARMTh}k5wyN SU·~W;E50u@g%bQ.E˲+ 9_]AD^<-CeqXBR })L4 Tkdl6EqfXlf쉅Z40`AiDf_*39nxzv4Fؠf:]%6n9f$ ŗ옩Ld3۫L1@S^6zb/:%UMm]Щ> ؒ|cn6툝nǸ"1͎3Am-DT**GiO&BL1eۨQt(SY􊶩4.2]2X!33*:kr5H"[Egđ1? 
DfQM.>l8p.ٜ?i5?ED倈"nLG2e %\J^뿔 sZyp"b^ _+ljhu5 huRU*$6=di&3+iNG'!+6ƈl8U'i0su6%nj7z}L(2Zkr)4?44&px6{?fǹaO{R}|)oqӎ`OcAvŅs5+ JZwt|ސR QDA긒ip}" LkGR΢ahf poɫeŤmYŨ2 H!,:36) 3J`IhcuQl/mqxuRTME2?pd7TuWGHT-viI&4#oiWq`G'bGW21ђ-"G ryа ~W X)5 #T ]2)tE*Ur3,?IɼjQ]R<5x]_|tN.5@Vioi׳lP,8ݗWnki|0}ձ_?SqƽB}?pF!ƾ7l9~*ֿ&acecVzYwk8ޞ\kOSző8UPc[ex;;ճZZxGŊyJwSWRkQ;+Tse:LKnɷRcgS DH`Ɗ.H* 0y Q-盧J@E,JPf"jr>*J 6iSNւJm_JTGOdBʲ+2)m0H) DVց/#.S2=cgyQgf͈=i_H9~}? 8O^'0{ Dڳrʲ|7C,LMښ1i \. %Ec2m= myݔDuuD0ws*nCrM(]#L90hR,]Р`UE3RzI.޼ ՗QmEK9f=+{9GVY-1+I(XJdU{LLN%Q-u$yg5 aN$av޴01_u />_̀r}I#Q)t7$}+A^,[^»w.q 7mZUXk7ɥr֛ܺ9nf%3ICnqeݻx'OvyP_˫:˜);y.:B<{|$q7J}_6ݵ&.v<9v#~C޴r&m'ksOOWͣZ8=`}A':#0v s/P<;3{$L2EekX%Y!LAeLK4rMX:(fDpTJT"*HfN+Æ5Q$4@YF1V݈͆X˔c`uuprWyі}[քk)lI߱YBu72hiW 2W?'lfk(.i;+rnǜ|V%dˀIA*Mv/SvE oeaDGe "+(L^ׄ1&NnlS`߲1 9r~9Ɵws77]ͷ6.D8Bz=E#?́uƫAA:fa~jeX|@ښ/81=ԋg{ۿ l v&{&<~?/kmH]\@uM b ,bxH.T,߯z8ȚƆe3GM˭գbf ;nNV\Q[Y|'V*2,sQq~w~]0 ͂},3*), Fd`V|Qq/QܙW]k^%q`;oVǞw' khF=^YRұWx-jyo;7L~d2 S֌Q>x.p|l9>[1iLGP$p ȝیw5])4 P#C*3ho)1MRMlGɭ#sD;X֜Fyhݴ/h-iL ̔! ;8dqK "X&/\scwR=Aaȗ[%<p7ȡH1[gK$J" \ؠ`=F#eqH0+Ԉ*C1FZXHuVR1jp&Ícq$-Gkn, X@-L+@w =$,1.1pjє L(LQ8J`4 0߳˵ΤFɟOښO o.-v>@0F/.Ebg0FF%pE*f{ޥ{WFrAúk.ܳ/%a짓!PF; \A4S^ J1u}ʎn(F:+Wo^E-6`M@ZX.*laR]7nCU߿K:b)oqb|=S-,|7)hxoV  .WӋ2I`taF~b!C0O\fnͨT3Fʷ{Sh ~-BT.|_-/,\q _.[ۖ] ƓOyqKPiGa ChWIL,%]*.FŬk>'?A VLdM/T=BoKaq 2z߿J?_x|ū0Q^߿/ҽڻ (~6_Uϗ?P4h(oU4U옢U.p:Ç[M/١&bQ㋱v3fXx9waEyY1z>5)"o6* OҥS08EH06+4 3ǽާRI-Lkݚ9k:dTN v>B>z.Z3`jzf&oBv$}Je}} AL˪8v]\mΒ*gA=\BYua=7?޼7+nyhMwpDCIĘ7e묷Gr3v򱚄T٥ R^Bڿ09ΪZxkN̫6(bWϑ4aUEk*]%񱫄Rw՗CWȮ}D$QO: 0O4'x4aWJttЮDs&Jh ]\d[*%tP64\GWz-vռ-4a%lhY`);kPoU-X#Npqkʄ՜ꎦ#MSF-71Iሴ%9iI5$<TTL.PkR ԳK"ht2WDu|R>rc{EbYXWP=hQt/J*NiaB1rKQAaO8Tg*uSI8/>yWs鮃?LRoSAO]2won/JkqpfZ.;,ejU~E~+`5 W<el, efzֽ|-~@3~Qʯޖ5/EV(9&6$|^=Wlfճۯ.|wSnJ,۔RׅokVIra6\\άTŔ88E0 #KƢ)a< j{B5YMpp*Y&7ͮYe/ b C0!B* S띁1!ye4zl5ͭyZmNҋ/%͚z[UGrԂaQ地n'Ӊ?\pҩ3Q6UXyṁ9t-}A{ςJΙ(.r2+*_yi $<oUDKD z@|#Z2z$h>ץ]0ެz9҃z]RԬ܄y<aQWW 7X03|s~ ̙:)sz?gY%8&cL!,2)#zɬGu{,\7?FHP,D!"s.'S Uz)B9JqTQ( Ȣ`;RlT+\$JGFR^90t_vf2Xy>y9c:3օmL'21ӭdvsr=^s/bԥg6eӑL2 sЦO5Re2̘7(S QTHn8(#6@j{$AX7@F"xԲ{6jWI]d"瀢$Zh %[&sD$zfy؈oP@ yX'Y㭇89pqUON7o~YDu?r߇acs.LLκSUzqTuKMr-컙O'GNbcnS:C2.Nw(u~#VhkQͮuľ;BRo\?(ѥܥ-{X쿊bgqM1gv^)XWkO6[)ǑhƱܪ8.T4zۓru~djRC{gth;QKr l{ƹ^Fo:8ҘW3w"&vu:dT)UrXuS7lZM|5/6j-f?|{J8x{~q#ė^vW5CRW꙽#Nۋ?[[f6VVQ0loI̥ͨ| dlJȞZpI$=ǬɷDSRNcK TrF0' .yVU/F'[4q8Qn{uH޼IWB˼,dg|x5!֚ߵ{HyhSWۭK{fmf/+]~XP]H~U,jWQR|PY{!›NY4 c*lvvíi|Wx5Z}% !0:R,iƏuhDoh4Ͽ%zX]MUS$&qKv:ЁĐj9)o@WcT}.&]w'zܷSڶ܋V9r޿me_osI6p<gXܳ=s8GNsO.{4?؈;?W}WX\R;xp1=BeUՠb\1 )(EnpSʵfQd 0cd%TY'BͶu/G9znGoφïG%qmk4ףsݻ}iO>.!rwJ/!۹ '0F@B-a\{CR}'H%)5s:^MjS3 %CjϘ%z6xStrƋ~5H$1ʩɫwd|)|o|Q$߼f#_|S6 Ԅ(1Y*PͥNG5'-h%Wm| !;4ws8/<=>ˤL1;r"gL(7X f A(5Z|k4?zn][t?ܞ+Q4c \߮.`+xN;1G:ě*Bo7o@]wX!#y@0:@՛6@im\ۮTMHt¶e;d*)[5dfG)X͊9W[*Ur=<Q\l=߯Ni]dlt~3)-6rJVٻwNt- 2 g )i%iSMBꮵKPZOf`.' b{Α+pdUU'F%g~֯XR C: 8ek JcpbFE%Vhj)玥1}ǘKBêC/uc^u#FC)$-#OY7p#LrtvM Q ")Y -)K7ƿ!Z ڼOEhXU`Z۪E[2j Rj)=\OGCitȀgF$=|_H{3ziIcǬ|1*\TqS9'$_O0 Lps}18 *kp/kOC$`, o 0<\P8Ҳ0IJ0uJjr+kphVc*(Ov1!, +M:C/w4jJ HV"-2Ї.fZŒHuk4  l52G[.cx nGd6 S,TGԷh[4gԶMRxY+Jb;׭&`v[r}}{v'K3*!Bkm-VZaL=蒂$c SyH! %y.p rXvJI e@ F%u$S1(`rosR 8^A25{{EPN`#S(;$dU8zux=‚y8Ř5@gٙnB@tʮZȽn#,ҩ3lӨ `-5Z]'Go4"Ea-CJw a52 Hc/^mFpq0$U B,l:{<ޜArqj\{Ǵ'gLTT @]I ތpi)@m ݷՔ,XuZUg^\ !kFlW7# /o3i*H]m%9X\e^A7,U HtF(W ̝ "PuAHْ2Y)waVFAexc_^|\mwȊbݺ;Om)Nz8C'/ ?xc5xWfgOc7v/?ˇ} !,{h&Nޏb}WKmA\dk#=TCMO:m طwm$G!#q6e ] F&$~UQȖE[%,g8]3U]tէ)[)|.~;;tkt7l{RoQtqxu>wz?k;[ogoPA[K绯_~k}X8,Pg!tY kwm|_Fk_=MG :ĵe<F% w oa_c~<8^}+Gk|fWC)4oOWw},|46>@1<]V(vCwWtm1 wP[1>LK3. 
N)4/S Kk)VJzR1sMk\0ׄ&5a sMk\0ׄ&5a sMk\0ׄ&5a sMk\0ׄ&5a sMk\0ׄ&5a sMk\0ׄ~k/ K[A)\b@ rZvH RBDuyqr; nLysVSk` #(I]r )UrX!`v~IJ,QɏB{*B\],rG,vi #ֱdgi&mrmHf3+S̫ӪL-QFRXHd+H5ЊٮMH\1SW6ÔS/i7_{ԃ4>@͙5?~5"K 2d!q7 c঴ ~+gVe)*u6bf{8)a &btPcX10T3g{`)WXM5L>ow /j]36_"8pS㛞0WxL"*(T*J$&{86YDkWQGģGkKS'L.3C˅dL"WHn3+ЊRtB8DFumyD4~vC|MCJ8}$:d-L#DХ53ba>b5-t|eېSxj^NNw=>'CیzԤvf7 ?JAx~Kf:IW}Ne cnRNY đf#rk[q'y1\gYD5kLJ>c_Fy!ȴ\ %ǓTfGAgۊZ⸼s:\{Ib6ҩ$ﻎ끤oNm\>^0#E$#R|7^_bN3QWkt^7>w}?v)^mo}ee2=X=eϋ_O Vכ4?AۄLaw& yrik=t..7h>מ&aq{Y&bpNyWAr:&n]QE?oģN%$A6̟blgIu5OuQVx.␨&RbJ*VKEU ZҖYmxfhʋn de}vЋ YeWoDSh6c fLRfG"@\e/DW%aLY-{zS.sWwܫ@ؤ2cP/W}8=}[% lEu8ֽoZ%y}D򮟉^V(K$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$DK$ėaF ]$QH J2! \=B0 %&8Ñ䂝z=$LC}:^ApRF[vނ`X`YW֙I_?sw 2蜹y|\ tANߗOЅ}k~.~}ztj(DjQ]NWpcB,?_8qG|w;;ʾܹك鸴vKT?Ύ||Չc4V1/q;ϞOveqd0z}Oη|2F1ג2-)}͈fmf.x'xqGzzj۪`[]d_}ʝzlC:>\& &)?Ɍ_>.jOcvxvxwowO߾oݛ߿yˆxW%pgAx~ӛ>iUihoѴtj]ͻ?jkrOG۞%ôp;׿|;N}Zku;r;f K nf /Y{V ӝUCBoqb#n㥅Ix_c^ܝK<*GO+d^xdwmkRxx2h!wRڳXL%x6>p'd!ʑ>b"5zL p<%-azx丢&J- {YDVIdrΐYg9 csTĞ_!;;GψfF{7kUĈLe;; vǒ&ZS1vvWݿ߼U)߼/d !;f|0s? k9cOku0+-+xeUV4~ ;,kY ЫKb>xLyԻe8Ά#8~͟$:i _{W4Si b#O0uyM}tyNE~Ĭ*԰=iݡSw-/\#kE[]R k;vRV>iÅ␗H.}UsՊ9j5s程ՠh>iU}k uM2'%p\̓VK*Tp䱴EKG9jtNASSjqR$E/E[ag% IXT(aRà%ZeϕaHǎ.lǞٟמ~K߫X3ZnZ_ ݋g Mz^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^Ez^r gG[vc,5]X)?<JY !Nc_#0W/yJ%9czOggqf|ɿDlDcyjT9 R{LB0F 9qv$y2&ᏛI#hGDФ+4Ɔڼ,!3TWcvRY"%R SdT 8c1AΒc}&sHz[ml 2drX5dD.8e!f= .69a#)0 }#"9EDE #&w %8zN)Hx`̲ѿcv)4W?xⴙ;1uV%EY9. 7;_׾") IRR fBv?D$  z,".Oq,P -_kV܍[Wm#o_(:yDg53\Տ6WgTV3 QƩTQ$cŤ"}uqvlO^}F& I2+0F(b#tj%Ҡl9W\-sv(tzD5Ue؂={+b:IhgGseqADǠ]{Z!Bl'JM[s\1U sKj}V?Bd'R"sHaD%ꄔ:E4sV Maǐd r4y9;YW~nu٦ E R6 U6܂ P$β4{^<'ZI.Pn?;g91ɜ[˘Hb```4^;lwm~:3oS 'Xl`AH;ǒ3俧ؒlҖVMlVbVExjPԊi󓚟Us̲d h0RB YG$e`nB =*cWc66'6\6mxhmL\FhΔZVD# Dt>% .X(l(yӥ 6m!ٮ.ыjSP峕o;O0gɇ :謀qQ g[@M2זU)zS06fȡQ43scjQ8Ԥt7.DDC$$dWJ6)6Smm^ʦ:eC_Ne"MܑٳS FHީ[A3|ѳQݳ#JI s)RpLXj~" d)%UBY\W>Q ]MZv ҈"B׀BW+o_  Vǥ[wkJϹZ y0 ڹ86tJHrBGCGL,ӴGkڌriYDP;OJR9A'., ;`m(9(i8Ug>B9}o!Mlr5t0yI`kk^Kp`(gST80}9b?$CD:3We)ĸV9kdA6QwE\BE7girsM{+Ҥ7&gi6YRf^Ӛ"L~}AWz7}ͯga"d =rX'-CشsX쌾B&5 fݹ_MfJoVSe[6coP0]nnחFP9oi#?;Pu!D⵰Qh/rs9f$/莙4bb`,GyTѕ9eӋlJq^,֠ե-P&mA-q]~Ŀs/j SZ R|hZ};#ީ`@w nW_ Zf)F[LI"po[K)h%ݾÒnZ[s5.zzB$kt8goӏɪ/0ˬO=Lד٩Ǩ|IR19̬\:&h;Ow;gt乯}Aob_?OkM "NȈ|8٧4]\)vfGtе,s)qF !Q:#X %BR &r6$ gc~t;CxQߝL{0HY,N֑' kh=4{ک44=k=4{{Oci~Rci=4m4{{OciI$Xi[J%o& +$i{hR=&Z\37\AǽwxjR'L%RFMˠu܃|Dyˠ{iCC֒y#0&Xa8#JsaU2RXy.]9Yv聳Ȍ4"Iw'I_5CB1CuCV1͞d1lT X  ک@`'F6MƵ״Ș$B ;L*WͬSf.IMFc)ο˱"?ֵb׾*[O^&AWS~󸊳igkֽE񗩨Du؁ Y I'Y#`;NSjDqN5 6 ,IZ'J(",ED !j{YJ2<8jR)ie6XR\>WEp26dN0ީQʁj 64cf1ujs27:'AKjDaK>y,8Ay弱fT`%r.H׭8\dVBR"!"RIk0l]78{$՝^2z3{I6OLSġ-:.C{ou-Nsr &MO*|ڏ?/SfdVywt Tٶ cQ92x<(|^(x bt6 OɈ%b\I. mMB Vqg2R-c5qUj3@[MA]f j o$*}[^d4_b_E#Ň3xC|kOZ iy9a(6&qcA3J~߀~w~MhfJoVSe[6coPt±P7dOow}.f `yKkD`؁Ҭ 'B{;6xDA#vb#уAUts(l)ދ.|`:HC Zb&Vw'9{r&I 9UAHޑ H<)jk$gh.tyVٺ4yV0q1ZB/Yoqo] W«`vǢټM@}€ʐ6`YqA\ d)-2M ~uTwv]8-Yz.=|d8rY7ùJ1-֘Ǐ[7>G>3;)@\Wra4Vwa|<#VenSP]_π|ٗ .  
w 3R4I)OݶbsiO ΃Y4bG"~$> B@΢YsR_18 ^yk%L)Y%.l(Gs^YNɫ>ӝV_B4]|F +Oy E!eLFܗ!ZMtJ]2*q' &\!’č.iIrd;[Y=ϗyͽ7_.0;<}ayY,uԪJy.&;Vpss|;hG9 Xs)k%gAh0ե昃d#yL!LSpE[~ ^cu2_O3 @^/;^t=&[!mQi݅/@G~w$`3Ժ}Knp;ziOfzԲֲ>xm/Vb{пGZLɪǓҕ?߮2 9]_r/=O`GsRO?ҙWyZx,iSǕvsyRrs%E/f$+2 )_3s?GT'VkE0tNdB{i]7v qڿBކv6 _0i+S(9^3$E[)[RH9䜙99sG=inv>-ȫxC:>֗VYOUY*d:.ʚ@ڿ54, r-Ύ2f4A/F˛aQu:<WCS\2 *.hy.\}2p'5һ\".s ʕ8OH#z{yO;<' ;$rI"@F1Ȍcȉ$1) Js*Jqi*b #Z2OcE_g^N.ص~`h)t^- Jiv1C\DT4NS"ƒE*L<iI BDpCF,s u xu %(yսbNԱSB}JKط-[6k|V|ѤM;نVmʯһڞզhWxz.rihk61CsKaD¿澷[1\>Ag9-a63;I# LadH VE8̐#`uuǣܵlL:LXϙPµSkv%?Q2p<RdB!Ye "1APA6k f\'N^Iɧf4W[nt7 .Ů$)Z37Xk?ySxΨ~XXWUTBqHC@mum RndGRL<p+Fqd+flcGF!V1ŝSG2RabƷgmc&eG!XPL}%^+#Iߦ8= b.(mw@2LJU)ѤJTT*B$LiPPӬ%d(%fh?r(X H5MAA*΀-oR<sWٺ{$k| a%[ыns(QQ*XZR3 Z߯F{Rگӯfd WƷiVXXg޾seK9F]$c׽y+P?-H?zVX_gaULΜV6bmH rZ8.F5e-}b-J׶4=drب+#QW*|Z &aJÿgbJP. Bc-5n趖̞G:C{K&{v3+2A٠eoF*k4i* %uhn63{ָӭbɀN@t1="zDPOpGkHdJ` aq|P!k4[JTrP'K8!/ԿƳ|tb0954uJ/4 DI1 ` <`uI)0.Kܯ.pcŅ6v!/"s:s1Eh: Wrs~˅߫?L'Ғw~yFzxg^Wyt&TLsEι"9¡N˾>"uswHmEnLS-gwRL :s&BHDyt  'L 66\ (:0!S&a0"K)µL^jFuvΆreni DJ<rֳۛ_=Eӭ_=3[e5  IY1F('{ # KP@ri\!m::_u0OKl϶mg(HFǤ"UϸJb,"BDMÃn[z <uO?""}*i`2CVpƸVb$ jD #ZA_HAp=?hh'%F7アZmR8fI@rƢ T@3oA-!S8hJ ? ao"SkPΥ+~TzpAZ9gq_=2.Ip7p*L=8VgPFƣ::NݜdSva k%d'N|G)t0"93_2קLe4V` $ "%IutΥSwd2\Q9xy:o*X7P& -,IցHo`Ug B19<>3U^:KɮLFղjW񡓓\R))h,%8r4go30!oS >lF&?g+#jZguZ·ƻ䢹pT!51(pp~Q,Ջ]rdŨq:Sͧv h=]uCڻѤeT$C1`$Gf6`r@JkXb{-=($u}6iQ8G8Ry~٠ognpu'`FgO>}zw~p3L_O>}xO0R64?? ~]f]S6Zu9f#[K>,02fn.p+sw~Pp3v0I""0Ƀ& 8H(dF@@ L/-xt3'XơQg@4닜am&MG>rm,Rш5[<\F\ 9HQu*9y\jt 3::CCswfNsS)ikA<䝓} z\Q &Zۖ}|0ih|88~P~ Y9Gx'ߟ~qb_hgE%Y@&@d"׫zlp *: 8s_WoNjޛW\=…$?6'½u4]uG=in AwxC:>֗V5_~8`PYLLEY3 ;CVgCl/˼{γ_e sPƣA pl_K( ǫէsc̱|2vYGH(򹖜N)b 7ٸl#UioRR ]9ں]|[cR;ĬW逰 9HysQ ,.cm=e@u|jr$qWֹPdh߁P/GY4LrHkk^XC uޓnnx&w56S1kS|F[JvenUjɆ[#F+o#Q[cj彨TdڃIB27` M6,Ea (93XV8d(PKev ٥GeIz:}3Aeh&+MhUgR}4" wN R ZԸw~p7k˜|vvJ=3 \yvP֙"]-ؕծKSe)~1*KaW }cUrIbaW>za .YplѠr$99N -a' 'qS60`cXd !|D UBpO>n[h=`X=+M菹Ù"mU)IA2iV" hYa22XvGĒoA@E<t}ziSa(U,H||_WWك!UhH. ߾s)i2g\!s4ٻF$W{K+h;vμ #OnIJn}"!IIEJdVdVDd|qX:i?!E(ݐTLn7 nxv7F2}B*lNG]!SQWZa]]e*e+Nve`FɨLXWZI]]e*N:uvԕ$RS`%OF]er ?u=z*S;M+!+QuFƎ'-fIYyv,THZsUTa5#Eƹٸ`g >sO:xSγcuga:i>eggo X%(hM,LSL!+Zr-Ei)XG}K3IUbhts? m X~L/vƇW ޅ>gϙ?C\'^*^:^,8DcQ!bt&G+"S2j  `7I-t[&gSm5[)!%_Y/.^ ߺ8vXGbw愎`.O(ͩg};v<(?S);GJ)v2+RQR!Lq>@FE'pL0s*Խ\uYg[)䟵Q FG JJM$5!;D()Q- 1Z?q m3rl^>!>s< dcɀ[O" aK!Sz+ =n]_hU }<5Kig)x3J{m(Y ƿ 'g͖sKu7p#ȝ6߄Vy:J-]^:1]"Tz[!j7r?G?=Oո'Ed 4ĵ!$"{v*!%8ƿ.\t[G%*ʖ?%Qp~ľ0LQ`Al4\oKs&1L&Ǎz:ңrB v8L-H۲Z[VK\ *]$}DX3 hPii qN_(8BKNl:s*+;;qwy߉SBtI! I'`8"^ŭ*dP&$O^yv#vh(Xj#VG|A(D]ҞiDNTp^rB<-Kpklv+h4M { {'+_oxКzLQB C/, #ZpUZIc cpB:R3D& 99`Q G)M|B(vfKJB൴ZD191Q59hv(!1Mo9Ue`T<jpR!-۝#r|_jN_nF&[8/Mlq4;tx!4^g6evK*{ףi٧[a]Dn$mUN3uō͆soל{ [غn]T7w9lf$3ͳYv- lYܺ{A;B_s7ucޅilg%;q]E{v^z֜ t>٦fe6z Mȭ/5Q|hl[< 'U9F›_C \X (R ԛB* M`%\w۝v'zk+T$QAzAH{!'p|H)ȐD䪂U|(zI.g4IPDBg$52Zks vDVkl稡% JvAr9ߡˁ ߙ0/e|! B E)9ʹ\n7S pv3j&a!;joyUvj)1U 9(Y0y!,q#O'=qt9u͡ CL1+V!1(E#t0ꓧDA#:kD%4J3pW1@48/&ș7;/m8|_\,Mj_;BHMW6n\Z܍F@H]ԅ3[HJ¿A]ոǬd.. "[^Dy8ђ&|%$Rp]!!Y<BuUUO[fO| u V:j l kE_5[XhLmgieSa`\Z ʹ*Q"Qn\xM::c S~oG?"'MfL)1"xA$CDwx!c- xR%UZ(Ѧ$'O۔Ǡ6UvAW\OݥW;h K;KeL tJ E*f\rBN#E#d^^͉)N1N1D`%a PO s&P'TFB8Q`OJGQBwdʑi?sќ!"һ Zj#J݁-ii(gpjxG¹&‚I0('smFo"+GK`GNh;ӎWS=< N(ISH$bk94sL"(2*n8i,2 ߉ kWt4Ox ( 2 urr79ʀI&d| T ֤~ }. 2цᐍmD=(jvg#w'%{ %n Q3}jU͖z ( UYB&4x%^CNTpЄSƛ<6 2*9οBP_+i09RSMRmNxHeuNiai>A&qZ(]Xp8&&OJAMnYn2_%s7\8܎#5Z~BCʜdAk~x9ke~Y7GG6"Z>i>&45Cs{kBY[<Zy;Z.bCee.*Τ5 jbZ-gDLn?V򐊭y ^1wZ)MZ3|#vw}ZٚMĵTZ6O&ԦeÀ-Û? 
k?Tc4:ߣUuL2^I{9QRf?{~Aw;& ޷Rͽ0][{Լδn)1lTc]TpB{SPܢNLw-%@[\f#1BIk 4LN{vu ocVɓMzGygt`Xhţy]}LDS7KTi%.6/<r!!cQ -Ж AY: )P&u 9'{}-y=KPdSē18o 2(QiŴ"2@FfQ"46s2o7ɿ" p-~9=`~9A\W#%+hFF#*j#$4jV~Fo#_?k/^UQ^zn1r:ꅓUSii^ ]VhX{oM-C і<$FIPzeZ4+XA ҞpUoerYDK=(ً. l{]{_=^ioM֨oDr 75+R6ڔClFYʞ. w*[UF+(_S@b0E9r`oUhkV;F_rNf7l]Kzq's9&P h~_-6:2x42/~cf|px3 -+R 옖4LXFsR/!)#߾7EiLW) cŅd沶[V$te-#n54B~5٬q*J8VZÏPxSYH.AՑ1FGmpڪ|S mT U.e5(V~ogÊ #sJCxnd H6WQ24kk kb8"s= :c:L*'N|ߜ4>k ;nYU,2Do=>M2chӅN fھjbfRTmLYLhZ_msr TH^ Gc ALqW6X{{f{&ǜ Zա,bcxPr:tXp8?{w\;DiPrf=Q4Doq(!T?Lv_|1iѭ~;Ҷlj3vG7{.=xG L>÷˶<Nj<JCNɈEaUdIL d`˂EI9JCӖ?=mm"64|WWvG,t[}M.&ģ6pxp_&|Ncg Eyh*ne^Ɨw۹eΝ[j)e˃ڠD"FGMj+QWzTaK**5u~06ĮZ%zR0x34l␋fE3uN4s'XhfVgGpLѲ$@EN%*YZ+e8ם/.x\ȥ&=jvᘅ.Z{|19ozzs; yAu0|r:_yXd(*7tlBA#ŻsL@CP,Eb%%) fVW-FS;Vs Q0|mD:әњ61f-ZXv%زڜFCUD+%Uh"܊v^XQCTfxZsg:Ζt6 ]l}yIR`_B!Ik@Jr6~I7טL-!͞z/ _2-s״qE!^)k1 ?8&!M+[[>% JM*{;io'}"BAU!R v`K\2EIKY#)BK*Z,G1n8nVkۈirmʄ!֢bYPLJhTHCNd eײ ~nSryhyS!n(+uTNas6Vh]@3Ajb H&MΦp+wMkel(O:aȮ;SGegˡ7$ -빧AHLUB>䙒M}|σ`y|5y< Zg|àaZ{ݏ^l'sU?8\>h2 IT][֜)>[6֓fNN|tk:!=`*R)X: tlop)!"*ٕĚr-sBJ`ML:߬^Y|wz<+S-_x :J"ѹ9ABbMfrB5lMOsSSʄ(%UZ*l [ʏ@JՕ$e֔}Ex̞[4ԏ[\a@޴-P}\|2 :RmZrPLWSFC"'M%Q!IPq.e(Dl"ilB4oYԻܢm G/^^]j\wwű\>Z~-lx(ٰt.G<=jpʀɄ!SRCMFs$Qv#Oq~Ns 9m'?㜊VaԒ.cE iE".X8`cNŪ7q3'fgkDf&SI9>܆#VE}Q!՜IfJ!:;Jy'˞yIhXRoǡoÒfx]e-"hNGjňQW1C%X* וP>6P@m F~~ơ >赱&-eU( 8[$սiYÙ=)/\CiG\__0_gϜc`rݭBlfֱZh*q!"irlLf%䬍nl8L_u$_ŶS!V1ZCg 9v>5s2}{aDQ{`,bs%͂l”?8m66ڡ@>uI^շO'Q,XȢ] {5Ī8:'( eSEjw\7p燣Qbg3'S. ?vЙ#ž#9BrԞE FJIFHJy(H*ת %%yZUlηTjh VG]Bzٱ-C)bIk&y,;snl1-Qgዋᒸ%Eә/=_;!r6VKm"hڌ_pкJy/N& ;n XغHtCs>eV&p:"! rF1f=n~|G&UoU=/rg%@e.og?G!*G\Xv@Y"R0! D1L>i>U(ȱ,f#2;k'z;!a, _T|/%\ȉ:"DLA7/CC80nݷEudCDۧdkZŘ*2{ذj?$@7\mSw:4OiɃ޵6r#06m00Lfds$X֍,9<}+veK--m`VSd*X]Wz*K;6V6Dc{wӓ,$*P;bNܦb:c8ERLN_c 6qⳔp(*9P,#~[0=gz)%Y懀YB٦I"B>eZI%uv6W:/; {҄}wAavx ,3tnՠx2.A?yhѠ&5W20 _*PtjQWH8x6X0 wk7tT*hqKJf q̩o><$T¹S9 #t4  -23&^"/Oy!W>bҺNZL+蔵nzoO'nGqUMN\zX>f/ ZݦMJ|Z>~oJ i|/vZwN`nHg_5d ЫL?!aO߇o,7u.Ԇ{ZruF*,QB wߌt JS znnbI0*o攑 R0KL V`xI.Hs V0R$dQ$k3_+oAG4l4 uh4H h(&\4lACՍ?E`tZm-zGKIsB/ iakJX41x&Sͭ;DQt:ZLFxpy;֥ _e ^EW)ь? 
97} 㲩h21"0*xrLm-´.%ܥc)֓w݋$*9iMWXސ;8 mF௙L }?4?e~x4-h2O@(Gxj/Ә!aaIiѓ6d[mu@ Pؑ |݁e90Д}nif[reBo{K&r)m(E#h`JX4\FŢIJ~ PPkѼFQ&<"u㩖 jr4*QšDBzS?"ukz<*ٱD:ʅ(V]u%]o  J|*Qp תWxAm}opU*&[3%'@2|,(>\."ȏӿK7FS*sP@27+NlíkCFVTljv*hvO$ \JlB OcKl | \KLR5Mglgڴ-4ŽLhb[hYM8c#X{ ,2, H T(s hca{7E RrHd 3@wZI  Ay(Ac`(`ܫП [[a byzKYZU@eqH0+ jD #@Z/b[O_Ou4$,Fo{c28fI@rƢ T@3oXiYA&#b\ c")A38Q{X8J`4 xP .Tgĭ[g?8% g1ځ |ĝٷv Wo٣0,#;igצa/?l:/A>UZ2~pf=?y }_)dNS .iޢ+)(ԝ(Ti|l}3 *:mqkBͥCRUjRgϰn]w9y2Pogbq1[?7N!|nMFŶ6|+3,~K*%$:ިH 8}ib2 "Cf?NsUh:+ k|Cz"0["T!5#^\Y+OwI΁fK 7 m+Gb|H7UÐa4fUރiPfJ<]|MwfrUJQjX1gRy040~0,wr&8{St*'~+;Ǚ^;~w?K{w/1Qߞ_-8_ U$(3 D@y%KCG ͚ƛM3jr֓<.G)W,>90fnvDE_oȥ„eT\H 'r_AofIU?0;3MGt1jt0O]_UzL)tZ>xj#m$aQxJ)фrд5ۉq C%W\hS5aHOl/p*ɝK2u6$*[=(nGh'c8xȰɣF`%Bq!^Fd1iNgwS)زNim/S2e&ʕUmugN&Sq$b`ةzP҃9-rG0͙$W\\X+ Ӿ 9N'g+Z^Es{"ሎڈ bp$: m\ @ RmT`M ip<)(fI}~=((4`^$*Pc=omT@U ;-Q1䄥 ^*m4-Gʍ Hc{/ ;e?Nӎom,1[6;Q[c(!fH 6H^ ẃe5X1VJذ{>rՑ?s>_BPj ݆, rCY=@P,mXBwU[_o~H 3gKHGr 3޳&s-nR;cyXQ(-cΠ:>^{s[yq=~ ,>N};ӶS7i3ZeFUhj-z&Hǔh-,d.r ǒ(Q,g.JT6sk\$BX^vt6r& d>n",CrÜ9Q [c;"mC[6̶ֹ*5HcU`ED5@U4 h`F}ٔsL(QmGs+:2,"##Z) X+|$ #g/:&jډ#+_]q0nXLmc&FmzCԞRZP7soS:+??iA!rFBB,[JL.0"4-ɧAA|2ĉzFcfc;HhD)l)#4RAfeQ%LA8U;1YMgʹW橁̈|›252(%/c!(E&DR· 256Riű1k5};Ao/AvǮ3MYM6Dĵ੬{(ޝ |7c(,sMʸq2"zV2[x6gczk(g7<öEZτb~Rb:t<1?\%'Q+bR6tN7 kTS^ff'L3}{M3C&r{J$sTk'pra'wVq5b`ȁBPlegq aR/1",DDN#&ZH0<)c" _4FNͬCk\_CP(e*]5Q{Mۭ*pw$9yY{ylz,a~TmOʓh5X_V2O=y97h-Ss]/OظU bx@#Jo iÈXe28!IE xaHEzjSOSsW[[;n&|2Čt ,b3qsiXDKtXYnPJLxQ<ڷRB"DHXz&&с10B d|'uz$[qyf (4C!MAHD*QD`|&c*xЭ7&u`ТBf&CCz(jڲBô !ԄO'c,0.w0e/{Etxgٔhe)M.yj w,}yKfVs"T6>Ds[Th "C]kor+J7r`I6<>"aFbC#K#^ؖéfUH+,l|eeD#{&xAܢ>/;Olk9g/gYRrYL͉?{RCJ=[}gW=!w~5Er{polay~^ڭB^6G^6BaD{}^oVVغe3yz/}7-Ǣz;b5GVkO9ocA!ſiݿMG) 0_wdi6~2_H65e@ꤥI!hwnv16{}\Ze.κxveO55|Og܆Seu/ .+}7;h/N۰ʆc_h#5Fj6',N>޽Vv`%:' 22YHe_`}(c6Bc5&ka肘)&Ԟy6zx޺P2 PȦ.(VihL<ۮu\qdM-*Gԛl o\o duQ+݋')C~HѬTu9Ǟ6]tP x݁ISY !&OA򌰀abgd~`՜rͩ-!ZEr$ 24J#)c+U^p&Uoh>M9ZDY DPʫldgM4輖&l]8AhhRח’oo~DUˇ;F13 ;@c@ wPg$K!,*Xv("뻶6zZ+Ek.V_c%r)l,E`ۥCNZycRH(!Kr^𲬂}.łP gα<ng9Pz;PU wDt'ХCT '!iAUr"AK ^da>z\'%HIiFo8)%EPR几2l UbQvUGP^T"PJ0:&΢!&Q" .dx!2L$ٟ@Xx8`Y"I88qj:-3ұoH$( B)aH&p^zb*k(ʒA[d!s,i !f ,>]+q̹~2 M T4TE֛jVWIF%iR}1ri3/t+ r^Mg VZ1I(Ȳ\#/PJ=I_K`%1W%zOn~sFj ex}1N e1fC%Ж-M !`Fr@ֵokÁmhv|ÁBpszCZO߲Z:qDɕ(T2u5tPtI=Asj38P6s>.h).sY1dH"QZ-9Qv6`7QfLV`[&sv(A)#%ɒ QE2X"FWō%9l4C?cg64s1DdKjR'N%h C!k{*5UT%ұ-]M6]cáL1^itR^@j9vV'ތa)xv'8Jyrl0 )\N϶e*׷;'ɫ>Vܛܱ2_M-Vh6km g *6c U"RX$*EytA(LJ V6.IXEh "dX%5XH9`_Vif ͌#mmamɳ8^]Wdfts~3OW탿h;%:I@b`0Q֨rV@IMY;.:8KZcLSAV\=$%tJj6u vBQ&d\I䡱n&~bj7VU`)U,*:Ek%("sIꐝ^6F#%+ {Ȍ a/:Pɩ 9lFb f췇S[ `ΟZD""qg,cF@ #HHKY)% $ R0W,$DոNtP+:[e(KMިRLd]fآ3Ğ2^&)V8-w_ud8_.R,9.vQvq;2 d,;ZkrN0 Ќ =[îuJhOajv  L؞\"cyP2o 4x nd?>&Ckru0_*+ɋi:Tg^bq6ů|gu8K?Lg7|8NYVY׳8qt|%mP'UϚ{Vyb%sBx-}d֥g=#Κzfߋ7 ~዗K _ B!?g61coClzxba\{kQgaUy\o&..1Eow~ V/juw|a|L/EY_yRՒ_"X/˯_Mx6Nn׳"ٳG˳jË?d: h/>ڷWi[{٘҅Za5`RH37Rž>Eh~m),Uet쁲,&U'..Hk2"j)0 $$ʢcfK1rHS2[FYZ[[ \n; :jGzK?,ݲB$<9OJ3_swoUwn8:sy=:eb>=+p$^KK_$cC޻ P-Td%C%D袍+h>ÈT]!ѸFOgW&ⵜo&ubu{jN@ "OA%$MJ5SMO*Gp5J 2.jH1E֖RPP!De$r%YsAc)&t*ҚJ \_EyۻAMmR|h}v$ u>ų0 0 bFjx%޷ϲ/Ubv+o0V'5JegtYl-/I#1m|R{Ԉ%K{[cB$T]K GW%td!5>AY`Ƶ#[]lGB#Q?=xw Y'ua3,GOo (nu*Lm} %Pci6ʖe]I7v؁]۩F7FчC|#TG**M65%>ty0ۨz)/fSvx.n]IߺN3Su,Du%ZVR#7uyǧvK:HJ\Yq+jBQlITʙ e|yRUE?f%/Kͩ_Vf|y#Kؖ$wݖ_ TY!q|UR-[e Zh`*C <*+ʮjBmS!n$ޜ#oA/L>f*\BRd nT23:BNg@Bhϱ.@)=Lozz^M)Sf?׳C9KhyԮ?CxtGנ,}쏒:=[TlY..zǁIs Y+|>%|IsQ[=xz|wzqZ4#t} |ɓ^ g~ۿ~6 <6p&s16A(r0 R#%+Ndt%bוPRw7E]SR8dtŸM*Zu%e]MPW %] pH'd\!] 
-D+D̺HZEp!>//"d]MPWΣ}鹺b`6l2\̻+&z] % @`ҕtt%Syw%/jJﮦ%83q]:=Bkw%]}!RTwl3P\L c dL@WWcR:$JHW Lv(] .Tt%VǮ+*j2FVuz͌kf#{=dX]/3d3m9-.ΗWT||ozVazϜ- 7Y)Gm/!]1pH(\cSѕВ]WB{{u5]9-[$+~ JWJhGu}0euO!]1.dM -F] e7j: Ät`IwRC҅2Ū+g#Janj+LyTWhVjtJg]=uBb`l2ܱac%(mȺ!-̴5iŃzx̀r4hZp i%B5 -F%I''5Au.'ȔYSpJ)`c~ %4gO~Bb`MѕІu5A]ttӉ|*ZO)rYWԕ J$+:Jh1.z$u5A]9zF+Z'+5DW]WLi&+OV+ئ3Rp}HEWLkt#lu{L)b`ѕR2 u%! ~92{V9RV30c EGj5# D( ЕɺzjkUHHW ($+u>]ikJ( e]MPWtk}o݃KQ& AGsL0FvC,)S٬eH%{Bjz+HMgZR3drdS䘖bd'{M0z)] Tt%>Į+~\_e]MGWBJ;Nt%c/4Tؖaʺz] }B`NZL2(J(Ig]MPW :!] pHg㺱' ]WBI&jrdt%!it ]h;Sj~nhyx?Rl{N:YI'Qoв`BBqd}uc7-aqP: :@'+%u%!9Si/9+w%ćŮ+tYWSԕKHW ]:/7TtŴD%5u*rHIW ѕB2}^BKy YWԕ7 .z %kQ$dbוP:u5A]|Bb` `JWk\*Z )՗+ܳ񈯥 60(pid] #9Tdf]= ژt%^'+5J+56v] el㵳^EWƸFmŸaVp>sAe?d4͸0M33!vM3G5=EMz4Qo40D&9֠^Oe; SV򽵨ԠwTI_I6m?ϵ~R?|`?A`|2b2Ʌֹ3 3dt%+JǮ+tg]MPW4 Jg=$}-E] uekJGWbZ0TuH rJWJhm􃦄j!_.]1VɼjڱF[0j: Mt`)] TtŴ.&d]}1=կ;c=pi:ю5ag芲Zڢ&!]1pHHW;wh]WڒR!j2k"ޠʹc8 8BÃȨ7h=GqJ@1(7h ءJ&n\K BLqS7L2n}Bb`R$VFhZweo[ttN t%&2BJ(ͺylBb`?pRѕЎ5rϺLp 銁KzwWSԕ"^2 S2bV0] -D+m6\ի@ 銁Idt%8xF3(&t).A*Bۣ+*u]=ȏ38ﮆᎭA/򐮆QBdIYWOzTuAQ2\CJh]WBlѕBILY^Fq^2܀J{ *3v)4bP-3\ˑoW3HF.T6XbI7hxl{~Tq5ogP/ ~n4-sΟ7VK>|͛qJR>_YCh;BhuUxe}m E,V@Z=ݙ_ꯗ\h?7vRl[> >ayvG$)ٮ+q/w=S}yZϏɔT MeZv)AtU#5-=oXnK\e[4y}Q*v<) ٹv篛]۩F7DDF`nRiNUUmj0]}^iGƑz>e Y#_q[hyۜ ڴHcw߶b??W4Bjޜ]|wi*?:ٞ]lKqBxɪߊ;qS^3Bwr}J>߷,ۚ*ɮ۲7ӯ\}_yqZQ/t8MoOeйol;ֶLZFTT.K5Uyrow_½4:<~eH盯IۛŶ\7_=̾_\q ˳Ddv^3eg먷.$Z6)Γmqbɹg9tN.,0mܑyp|/Z-DWgٸŜvLIyGΌVCC\up%\tq5[ezmgʎKQ ?i77zoX@Qo}fW9yNrc9V}:O|o%AW'n?v6~nyq7/D^ Q]ʵE=-;D,{kU ҥj[]Xhj;o@ 5)+_==0^z(7g@ۣ4gd,vVO'm?Ϛvvv.πf]̮8y`'B#}gms44_ۺynq~ѯ^`&ov;՛Q$|EQF>To#.EG^}&=hZ>ls5N H4&H=$3&q-A*cG?)ryL&pfQy+z,rAT+JoTi`Uw?\W6"}]]"H7xy~vixj}X! 4UCv4fA~jv ,GKgV,dHB|Z.ajwqłmbZ 1٫WlB+lf6†,5'^fF܆P(`jhĿC=\ǫTEs¯$y/#o/𯊼̈́L b0x2vXƗwzj>ـ3j L`?_߅4 4k!D`D/yY{(5U-րczNYc1+X̑/}uWxxƂ.?rvvx<3J)Q*1~Cl| ԃvy×l/CsHM Meʷe`f.Y&\n =Zul6s͛mN`2|ni"ѠtQ-Դz/\ ۄݔbwzYuJ{auϝn*k}LF7S^?^pj+˶`sc1kK9BU uщ eR_@B1:zM+oo*W+ =!R*O^׭n lLs휍xo8H,Y VT+0%:䊝mh E1zyRZƍ6^_tg=太Jsjiu VM.lmڭJ [ ZXwZZV :'TWF/4"k=ϲ^hǕSɒbHPrM'4dj_A{uÜvTdi_3ͩ5{SJɔxR4PeINH^ IJGy."6FB)g(".r(C¨L ʐ 10L-p[KgID#_oI/N/#]d?r|K/r:%wIž+5 X*ĉBDL(+,~wrzt\^jQO|<]LyXTH,y4Uܫ4ᖣsV(ԈhQd"k({>V*n(osʺ0*$E&3&\ɝB,hiVuN{s!grM;II֤d"~ !Xge&'` ",8t+"uH$Ŕ5#?D $5#Ғm rQGa6'D;NdQlM:+2*dέI)>tik25c!┡nΊ{_Y~Eɹ$Dc)`"(£q&vh rJ(N&pQI'v6nx}2 7q^%o :H'&q *)3Kq57;W9G J7Xo3+.n׶{%3uq!c/i]S)9!Ÿw/̖J/6է ?\toy67~sk8Gki"t&r{ p뿾]]3wZwỳ߯-{W}]*íWo3׋we3(1A\Groe\΢,wRPM?iGՋj;o sK<(F"Gѳ%:%cP&&PD,5Jh!%5!DD} WV1%FFG4!<đQѺ i^/C) *kZM!tdٜؓd[Qs[v@nܺLzv$#B<mg0m':"Ӳ@K-L,sR '><g/Y IE'ce)zAEϨ3.2| EmL.i\J{xִKz<uЌp>}@?*O,"Z"E"0JuĆ pl{~ 5FY#琷ĐVN$&GU@+Hg^ L{tz|j1o9HZ, &%0 Q/*d.^RF]zV,QV`Y ^l  >]D/gP6ih&ko[<_ /ƒ㹕EQنE%}?ԇO|\Q_9``gIu?3f=7?|[굶EBkN- 8k4jA^3|C"t0g8+-ɇ#q-C ˽%՚܂+u8;~w҃㦏 ٧H~ҶTep6Z}2Q2X:m(ɓ|fb|dx% ZhJ^,LB3cc$SI8yW/΅5 MS*u&VlzK[ozEһ:  6;>R{HCDr!ù(^zRɵbv^(N7GxTt/Y ީq#&ȢY!HE\w ƀES^j8'is/9#c84PhwJvMMrK3 jь0 )CJPyb to vD#4yBRBp˅F 0PhP<-c z̏v%):۪AHƹl5(FU(!:|+¾`B{;PqUL-,\yEqs`vgmyg*wW@<.Ԫ ӓkUZ+srXµ>z|wAN^/{rVmWLbqO_oPP-p)9,N\ EWK1h\'E %9Mmn޵Rz K#mzjً+-$Sb-5 Njg&Eaapf I $v#67{Fm|.\ߜ0ų/gcd+C6`sM&4߲[%ͰWKiҤ`<5}d/(%CMS_řP\S>4ár-צ;gNebHQq ^k j դfrԒH">ےX:v!-2[4>Wmĺ$ 5EgTg b*=Aw g?\nf(#Q=#>~5Wr)y3C/9G2^G%0ñ&>h 4Ժ^J]IfشgLsLڰ4gܿ:/.qRsi_4Gxf}1HhdD",0 % Pl/%L(8H=a,3\\׏U3 ~ewog o(U[_Q4~74# Aʌ.WoeF5?#KiqF/8#M f=`8QHyyw=b<6(M!z?1+RqJTGd\{qiǥ&$5ڪ6ҶG9_qB 9JpCP^j0SWg3fK#=3 '.7UpVjI{q ҭ*LV|l4;mcp_i@{Pլ&xRSSBVw*g+i"ve|&9ߛ};T8_n6IftIaŸkbj͛F^Yr`"WWB|{˛\z[ =$;LsV19pۯw1Eo|(̣i5DfCgt[NMk]$S mQJ} NgstRQF&f ~gv|uqq{K3n|SlƞGf:j1{8K2˕$!LṞ0%h)$n}1Gd/bb NΫޚрZ%cQ`L)|m2\S Y3g qxXyf7\zIB>+ڢ-Թmt~su ^qlў3RToOv&ӗKrwfÑwoڒkqs^P4,oAZ j֥ AJA/ J,狛͠}><\~)҉9IqjU:+m'}ӱXr,D9BCnBUwz-6܇h1sVbiVliX]qvl/tцHb]j"7 mlYeTI}n&%Y"6JDx遇IJQ:{h/r_ 
sG6lj}!82Wbb|Z~wsiu=\]MgNbf/p=ds'/<9%n-7"$htO;' \O>$|=IjUtqFg!pW웣+R\MN:@+Jhֻjݘ3?? VO[)?+օ3ݫJ4-b*,e&`BS[SsCX;nds)jSEB`opdDJ4]µd:,r8u}Z9N+Ts5LNF?1#;^{DM 5Hdķ I%.&i[R}:Y4룶YGbo^qӿ\yap !+{$m@I]~5-ToRfRB8OMb-(EkNKrXJ4PLbQmFR1Z47yNb7)Z=د{RZ^+َv6'z {9_|jNZ%Z9.IZ(IIujUlEPXIbƌ، $ۀMk%ѱYuN1*۫g? Pflo:ܲ. M!z{{p6GQĆM@;Fa,[ ާ\aUB]ˡ➭doJ,V]HPwtՇz}ys}x;6Ywڲ:$hy ))qȶt8jmoΊUeTM)1ڜ띧8G]UUNzkV/F?hGR 2RXeSWCHqڨB/2'T[റ8TYaaOj3LUq5dU y[lJZrc$x䒳"ecV`V[Bcՠ͕Rn6*:˄@4.Z #e[݊ Y;d>jvU{,gPŽ<j(@B]Dvl҅{9@o]T?+1h aA HaG?),l~50,lGP.&se׋@gžu,Q +"- p@P L%BA1nVX+yh)(`~CoAAO9:CdW6!;"h5{(3um `yTVg*Bb}9q4BγQ8PdF  eH;ow޹5GrYte2B/]awi*TՈֈ-966-)êJI!rP>d80CJcE֊1C1vh4@yf)h_ݨ{ڥM N3 TНE"&Z1a~ {^m6%Wzh $Sm\@-$T>""P(\:|(ҏ;A<+f*Kj1՞ʂ0r8Gʓ#;//TH;=҃Z"Ę{!}Hs0HW ut\|π% Aڄ׹!k-QFtK<ފw pL~Xdu騑,©Q?F}g53x4d_G^6TFbvn{.ƿvvwE~+~1N!<;X:>bE\"A1aCtiSەp ‹9Hb2 JGTo7v4y B \IGuuZR"{4 yPt7r 5Cb=mƌPc8tF4ZY<΁c$dՖo;6@{?Ko HחXc;j; $  @ǷQ&ˣ!P>ѡa @1>yNb ~ޞi/Ϸs#g&m t@0!тf5TlB1o!߉R٭]0ZGÀVkm.֍ErZM1 }'u6BQÌc\6|@,DGj;tw#R txbLj<"6 ;p`@' #b\qD`RzɃC7ăGCQ'ysh= F.} sU"wȣ8)Տ)cã_u 5h~zژޣ2u1sR ClL)UÑ@XG~lѐlRd dBZ+ň Md1w*rq<ky-!/[OC2m OX"v7eEhw1ƋEVׅnCL pc8@{0GSQsc(]l 9x6mW F.ͅ%-UI,! /BFҁ!FR,:0J5"g-nBh!H|cs4`tՋ7Bs1nv՞)/ŏ4al CӚӄ%߼{eHHiCb?̵Fۛ<3?w'w@%wAns#wy@D@ktXT': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uhuN 9Ir%4 %q%r/ >hN N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@Z($EcbHqAmrt'I@+teG[uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R':r(('+X9N C'@P:JH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': "'íZ6'o.x)z~\/P: M~z $%iI%NEq)lKP`\E#D(W,YKRpj_:Xe WU7.q5I083&s§q\MSp&*)[6 \Aqr+Vq*W\WeWwU7?ͦn6xB)7mlnZwSoL8 tfh6o?n>l_iGܼ'4῍qgƞywhXݽ&O>xu~WPk?ޏ[@_?^h iQ1l/A/OM[ {/ÄN|f⫿o鸖;x.*sn&GoMLs*?yQ)mCM/&50ۡ |{3 7\ PKviq *$\ r RpjibAqF\c(fAbӀ](fՖ@eVqB\s&A(WѰE}b)VN9IUb6MI 6GW)ֈ'(IQ';2t\$fjY$Z! $je} Bn0Y X-UF>\'V}>5d?28Ip)sj\>͋Ij-͔ NSp'*+[R \YDE XmK\Nq*r.%Op˕]GW+։R(,^bŸ3l:-9Fc tL(6yKn] yb{yu^ ~WO^Ro+r~-Guory߽`{jMCz\}8\_?(W[Tg3A;םI]Me4 uoKlJۏ _)؎}uާlJZGg殖8x_MiЕft|pw]j9i!UrSPk<))0Cs⽠DCb9kD.?cj"DeVXpbprs+ϗUZZ!e$ZpłK+ ůʥR\ 9IJw1l+sH2h2F\upf1b$f ib(VLrQ 89%+W6.~UfZ#J>A@r 4,7b,W$gĪ/&2H$uf\MijLI*\ *V$\ApQ X.E)bs%Tfq佩u♥Scp?r?˔r{;|1'ޕ Ewq/~ 2ƒd7ܘ viTZc5nXa@UIp91b9Hk+Vjɕ >E0'WPk+VIIqB\!Ϥ 3r+V^+VYjX*fc%zȵF X-UZ!R \Aprpr羬sڲٻDBq\b+N9#ՆU&ֈ+ %H/]Y9MF ~/t:ʚU5f߯=QpIrji\MT+k*Sd \hf#WPMZ:Xʑ y8h3kς?-RI6,iqJ层}`:BLlQkVE&KrҤފ^1LhE;=;Kb29뜔LzLU 39o(W쩈ˍV XmY<9TW+/   Bn2bpj,WktF\%E,81bY̚-_]W \ r]O}q*u}*~nid&1S6UlpX OLe z7s28M\GLSuupWϭz\AXL\ڙ.2qީ8ۯmƺ4[ bxpr{;M&,C/=¤_ cZ>k<:*\/4fLR~oαBzmE?ljo!δ-K΅8j~6vI#j7ֆ2;ۓ*+ۓߞl`A<\}ؾ?ϯ|߷:gxu?6qDdwfk7_t|scN:w_6-wmsn,QjIi3hDf'~=+^u"fr5R&XKD`taQX+_},\A3A Xq*vUp5NRtDW,7d)L+ZBNT2\W}/8I*}& q#+&:1sW/W25*Œ ?lqrH \yKq*1pIY+Vq3x*ɇ, W|#1b$Ƽj[:XeԹWĪS"箦 q5Mټ5M\NS++R\=mdAb$WvIqB\95 Ygƞ\Arwb'Jv%N}˾hBf#.?nb}4s(4nXa@l pz1b jCKTFW+[w) g#g:'Wv㠧QqB\-IJYpqbpwuDf^qF\Œb Bpr m/6.~U. z\>6xQ!÷u nf4->d'럇j?x}Fʮt'kvd78툏Eˡ}񔕅ݺۓ.v_͟wG}=+C_w 5Q ?Uj}oWw6B6C<(')Ys] "rwi(_okE'{ݎvHM+tyOyoW1plkm;*̼b2m1ֱ=6MF(L?ȇ9<UBKjΘP wCWlW._]8WY]V)k?fQ\lb-]1RF=,b?>%{? ʺmnI芤MRWQ᪦UB[J(n9e4Slo@ qM5#GtZشnzk-k}>yJ_Ťl lؼ^Yz-о<[Mx]4Q\?58vSSbQ@̂"!Ƒad`zP aeު5uj;1/|Lo3TNY#3]ݴW~3Hzzq g-SZOfb> X|;sUQ3Es-k*ɠs7ixpׁ^,yIٷݻ=_n~xJCC3Qk!V羾H?%bA^6#޿^n-O.s +S Ft}{ӏ70WOoQ{θ҇M%  ʛ&XTa}굃7}i_L~u^P(M͋A $Di=YO#:kdž{SsQ~{ l3/;'l2njC {v2_7(,Y`T9F$㑅H.}0z)#"b1h#2&" T3I"}(ثoJ}c?>u tq( b;h֋\~vw>Ņ6._ )Anv.a8/gE+Tux&KϔךrٵxEC<~:@n/bM.PKOk>i Ĕ4@1(#1L:1KR;w @0yGR\7&վ=t}bfDϬG`&! 3VBP,خ &Vڗvi2+  LeD_khr%yWwSA޸\=Xy̑._,,CPiCtAB ͚r9gL)b-{5 zŹAn^Ps6%1 -uw&uK-ٺy+Ьrl;V/d\Yde DT6Je!٠LfL )&D^qZ~+z:vt= X1AJ !_t蚭k<+]XVSmT>)<! 
K> Jj .`k U cJ#68R$#Z{0+݄~gϥ9 >Ng9gWݸ^}n|opQ'_7+o{ɸp hTCW9i3QI>?/dt?[v[Yw*[$޴ ozgz7U%rOFWq&[FaQv(h#rD[1w/JEQBjYzoC&=vi}֘ΐ` `$nx&,U:+HF0*hX)8`1rX-"ro41{cFhNPƊXrgR}VbZJa~>B&/fS{> Zw@ǞLgjvYb,bZz'f;Zԭ =yzgҭeD:,WRg%niƷ'>|s~LT7wfӾr.fxx_G&Zʄ2a^!X*1j 3ꄷ뙤ݹ=6 [s1bNж&̙L+li{s_IX0ш294d@Ud5iNc]oOA'hH6a] O|JH$r[k0(xmoD:J^aSxSXaSVؼ5u⻮3s7 缧YK6ˣ#5(O 7`yxFb+RK#(N:F;y-Xhm 4h0H!撠UtT<+}Y8 W<|#vEkYɲ qűDgpmOZ'Uk9(86p)6>6aZ`B<Mn S띁/Oj1{-#cAR W+mD:S6aM:Xuh;C96}e~V2R6n~ d#_w; aWvӔ&OVTa8n> 3Jnx[5Xt}w(p6>'6 wG; o`ǢKcicN Qh;I財IER?*.u\۬tOOOgӏo)t4f^,ܴ^cMɋLMTr9mݣ!67yQ]pwR3y9A,?.s5A#eN? <0˽DGd ; Laʈ^2ƒ$m,I]cI.R2 FcȜc9ĉDcG" KQmn9&rR9"6ґaYqGkZ H*Ug :[b$$ G8K.=/"[yyby zsY|L3nPo̩lLW6,ڄVɺgl~+!Օ[ˋ"W\$ zٰg!Ltd㤒dwD\f)AqRQ!F򉶓< ]$V,HY<(` 9#8AiTԘ@2X19MEl^|{/>Z~/7EW&iʖq.ؽ~Sy }/ ny$Nܬ*jGPZHA3M4 H$8*XHǞ "pHnŜ(aظ^>窫w7AMUw0(iJWq- {MAt=t3mol~v&Wb ɰR^G_~"{oZBBe,YJL&0"~ nm:Ub=1o 1HqIQh)#j t 96 ȿVUGazPUk6Yy%p0\<|nL>~Og9E +EHVC-HLyPXFZ86Fmo%w޾C榵ɽ"L#Og=k2~ܒ]JF߶͙''o||Cw"s*&g4^+eR1EY"Y#ָΙ[HO5 Մ';=qn6tعjUD$l[uZ>UT4q(aJeƐQeS)ԊgCHn:BNN3ImP"p y4*΢F,,9А!`rW{Go$+t:)9}78̞{|7~qcpRn5T/wZ-ꖨ-:P_ĶE)1k:e6jO>$XA#Fa0~ izB^(Ú$E挑KrBJL5lq&[ O:?]{nf8rN1^%P ?YS4dWbxĺʉѫb O|mK 4f?h}oZXCY:iGPzPPnpI=K}F7/o,_G V t:lEֳEdSV؆iɚȯQ[T"pS6*z{N[Uov_z`=@`cM^T|M?׿9U27,sJˇ>V~t3n>Mɗ-Gw{mzy(}&v vHc-TZB'0'U e: ZУ娬Q2LLbs5[oFuz8.X}\/|RMh8P< t\%_RYMbUDԛ5f o~&1M<^ \K˼pqEIdSioѠ A7}*mh|"fgӓ~?qw7}j >VmrpMPa :W ~}a՞*U] ;pN'k3-Ǣi/֧Y̑/-& o؉i: ӓ?{\ot-oh 9Ԝi=LuO,bɽ5J~k<.'ƫ@O,Ң ϲtLmTgOqLyWn6& Gu)}.6ăNi}LƐ6S誾r%!EeEB( `CB^˻lrb$h}IIv !$_Bm_yzx- UBSP 9+/hbv Ed :/YFK?FT>5m֯"͍̽+m8~yPW)ao\jظy7/Ig:m3ݜj-j1"$i B*):&.#‡E݂I",Rib$/R eSd(tʚP\s~Uqjإ%:OS#\$;x{uYq^f ;U&sv:KnQI`pF5= ج6VaD6 +:rms#[)25YZ$m:bAkLY!,kD(`?=T UayQnB_Y7QP -Ve&3 j`t zP.(!L0l}cDvovO3g46-ArFD!zc Ph!yF@!B.feEv+"HU:FkJF!x%A Ah:#52*$ ğ+2lf|"VY>5L&fn(\B`Yk7oF-5t~^(hP?ģyB'15 R@T{'J{a6bov&x/f4ϫ|!2x&.&g%!nť q:hV&ʓ.8 Z[|1: i[_v͛U鳗6i.~$i*#W7v!|zpF;~]EѼy ?w~hx{~vxz`vC~JxRƟ#Ȯ%GFO9q=indw7^vv>U> L[4!^|:9X쎛ݫ{]dW]*VuVwRCj/T/]O#L˨}'FOqqQs'Gi|rl޿o_~8xÁݟ|-{?TDph~~^oߛrqZ9V8x*b Yh7hϜuǖ=Y gč3ZU*9YI5!N~}A8 s+ׅKB?$٬UY%(FВB51:'!5)L]( =Yb5=+Ho`(H/3!OU4u&h TC=$UvM˅s2VM]Kt嚡={GzfHݕ3OӀzOR{jU셗C4hnNOJi:P"ljE$d8L%q}f7W˽NئTm9nkDeCP t$g1[+0V\ɀ#EI'^Y(,0!D=|[\*mzir/)Sר;v'3nWbeF{O v?{ݿw=IGxQe s?FZFڮyWY-엹e&鷣[VajBeDFHj#W:TRNζ 9ZS 9X(D="$(ʳ1IŢT CR>dm-L5& ,α2F5T$7qO]B/V-y Н'I?f:M_J-f7Z:6 3>5ɬLA+L&0qy^2ID&0)͎əəLnGxJo=QOSM$MΪi^ˬ p'`pt+Y^GP$}~V:2R#MJ IURΦ~6dZȄ+ѸT RZFQc?}sڲZO.c@_eA p[ͮK# @ 8ǽ:a'k\98&:jw g=jHzT%D${i\sjNix=뙒Z[[ 'Qa0d YiRװ)^,@ J%J{nYz鲭t{]t\q9soXHճ&ΆzsgɷU;&j-5#plKd-I)4~8>AVPBLك<ѪAT/e/ vGd<Ƥ@A7P^:HÁ!PP+ apP {L$E4 *q "& QFWٌh0pmN `=Mj+RZ|I joMru*%^I)3fSܠmeDXJU6EUkJmn0[REi@ MA BfRF |pTEUR8V b|b[=VX'͐m5&Ags8MIhh@"喀L8 &-:Y\p )"7o4lcs)1JRI!)J$e+Y^$Nd*F'At4B:LNe0lk)h LZ &9@x8#,(a$Eqjo.OG㴫5=BxCAQrB *I$Jy)5oEeT=<^7mdДҔ78IJ*d"&!2Z]**HKXx+)ڢ^gw&eh]L•*-J[k/AN:[B( ($aCC))z7°ѪQ`1ƬP29 i)r5y tltl`+|l:-֪ɠ!<@xޟhlIq3?7k }9}AhA&R%s$(i[Q{|<;sbOy%tTw1@bIE t2:QA+PZE !IJ9 Z6:5ؿ{;! +U[UaΦni{Kq2`Ò#ɊLRҩǑ 3E)׾Ey[$hJ Ǝf 6w˜}Rrj;Ք)Y񖚔I9*?A`u4s.BZjoix.w-ol8T)@ 48DMZRL=T.7 K=ǫq4ӓsUT\u(eW_r d.:9 Sl7i#Tdڛ/s*CװHKڳ<ɬ EeԲf6 Lɻ_l!NMON.9m(dr`dX%PH#. 44@[--<-\JT6T67G<5I?ЅWnK4ҒDOD-DYJ`\4׫iZlpPxJ)1*Pl$z y?Zs|n n^`:&"BNo~LgCSٱiFkO>odϱN._\MwYv  N6Qt>GmDcLpQ.)TN8! B jt:[jtn]s:䋞#z̎Pt),U-u f䜆kFoUxK. z)xÁ- Z.'Љ8Q{ /3DOHt)Rcr4w mh6X6^;?^敤mޗ1jb<My\3?@(?7}nm~l~|oqQWo;RV6@W7 |VAio,FT|;۾'}8ȱ"1ח7iE~~cxRW3Uyo*c@=q6(b%Yf1uۯ P^ESжڪ]=Xl0;NNx?\?P?Fֿ5RX(Y } PJ L މX!!9Z-e)y0e8F y Q%p vXjgΥ6x_$<^:ꢒ9o\I X J+k)XZf&Ezчy!2:(^#Qt2SeH"n&ӋxMXO;LY_oIw^?y|<:g~eய]4J,6~ ͸ÔnT6A3U/l.:h6hꑘM/'G vՊ\)Gt`AJAIl5d,PŨTM^;0mO$/u.oxdȀeTHr̜钁$Ѻd3q^qnneoέ]_?uxLë1|%f5o7^nRRwcV@gb) ŚG"^]1~-3 xsc1Z'RaH)Z ZL 6[]bN=QtxHl|gYۜCF儋U"_^KUEB! 
^^GcYȺ8{rSXэOL5֏Y$n_\Mg1k1Hˤirvr~rw}t|⎧/ts꜈(pE.A?ܯ,)6cl3bmGTV:50B (tq&'?=~>9|]hXJ~;wUq,㼻W )z!c~;*aþƼ]?x _[MW4x_h=1h<5ncus7K35bɊN K9 8 $C.ɩS )zLB!`J9Њ42T![m!Z*R-:Ihk9_P DQz\**F5V@i}8OBwk&jrtՅڌ1 %Cõu1@WgfiΦ@\:TiWJh*C8=~i']f3Λ^eVLt90Y; &~`)h_S 9X0>k)`25*SIb*Ba,pvNf_zG* RTKRH:Y("1Y4@DzM!M91=*\[G( *> @f>|/w29(ɋ#i \(BݔW;-CM8,&w~~eYU[Jy¥?1xլص:o^H G7t.|C>cq*h Z_DMP()D;eٺ"-?u}J[м컚mQY[;Nb+v+!-?W6WN]pJtE I%g`G],~N}#[ےd儺"#4/#D)cȜ G)oJ'qSr>n_,b1naogfš˗LH9GA\5&ZdMF ^`%CB""T]~܅a0 ax9E *P[o4zM2I'deN 'σR.K2CȲIr)`ϼ^!> #tGzWmxLvj[6<&m3:޵q$=dGCq:upI.wЏjԉeUϐDQ#H$j]_WԹhlFM<$XA#FaRO?gzz!g$g y sTf 0>IKrBJ?Llx:{28CVy% y*9 ਮLzA+c%g5*fC,)ȕU1%d47 LYkxvSáC on'p9rs"{╁nv7L?s19X-lIZ"Bɢ 5P[T"pS6{{O[%:h_tҗ`=@`cM2^T>~j.ѿ{?N򥵯_wShd{\7,G{ml\oI@5|7zL,uN:b>N*Bwo&JI7l 8a ΛA@ ~za㕳Uw:wy#וN fZ Evږ7.}yxxotfNO׼nB6JeG' ,Eiځ\$sWs{O{.OXw֪>1I"k&hRYvmf;YL2+}.6,'k߰ʊc_#1DjԬH!߭x6iU څ;/v5GONF?hJB'Qgϱ?dt6ZL#%h&8DNfH)OF?aR[熵ܝT[Ѳ{_l֜߮BQ2KDF͎>]C~72$L,2jgV=!;(%T ǣ=:4|!P D ̤/M;Pnbgcgw|k? Qiב$(x70kho1rz]O=E}>qo˕:T-)no&,Udb-:;Bu 5(/2wVdzQwqCJnEKU:έq]݀-zx)lހsՖx$vPg# \)&+ YՈȾFACLEQ " >ҧ6,D^rpXyGeM4PyE5&)R yE293CYoSty̞Nz[[ 2Umwvvnε[̪<kOۙnfvMPY6=5&Fm0xTي/R,-!S(ɒRΖGRha`d]c(c?R(ev&d@ ΒV@JJG(BAKz.9 Pk (l^ Eap93V̳g鍜2)MfV m)5V_H`% efp ,1s2gSfc}C EzIrA"8T^EQSֈ` :4#pAf̿^̿aiKL y_o dI8ghOss;$O:yy5:g~7zk6! 怺/h|Ej/~+oyxzz8->9?L~Gp~GQk]|]Լ(mDgB2mA{<"|70Rhe"߻CMַn.|ʆEs~.~٪QΏGǓ;:iowO'N?|;2wCF^My\fj] {tq\s/Z>-VŞFuQaW}L)ƦHVoQ79o7ٸ@f7W˽ڦTLsktTl!j&:)l\je$I '`b+Z(l4N!1XJ&aE@ g-6dZ Jل=J4"Yx/6:S;9G߫ ݡ 1l~2_nf=be[U&;`U>Yウ5PK=޳ J2I sjNC,جG=2;%*5a0d YiRm-"eȲW> PRIA TFi,Z^:5LUWu0=5-C|9ږ8X`&YJs෉Cd :aN ,`Be R0D>~..7}<&1&s:htl?+f.a_S!dl| VoV:;O`U&fE4!TbCREΐ5f2z?S{[9&fo+RLX|Ixf)@&d-!+ T0 _Dj bjju~ѳ]aޖȶsɅM1z| Z+PDSP(*"U*`K+_6{,&MO.v;z[!r;=vLEMovMɲg5$*eH&6шN?Mͼ5{<9d|1dK*IaHJo'(;h" )%gYb1PZB >LG#9m ~ A H`$<^YLY?OH<a,Evjoo.,LiWn/<x-79 ;(TJY$5s`O; 2pqJt7Nzg%l"&!b?%PL Qm5{nPu&.&JH57mJ[ ׹Rjiyळ%RÇY=OSR5_jAYB&aXhU2Q`I# (TvuBxlxlHYgZf=4)h4Wrg GM.1#GV`S""5:()AIW08Tz✸CqN)".6]Lc2dYb t2:QA+PZEbY@dRn O*bd{G,E-B1-9P9w1猽R`W!1XSHJM$|)PHk/_@dԮ9.UQWe{) s@8şQO,8dXNgj{ȑ_17/{nqv $L:Xi7ctD94ʴ^NZO~}Y#drD>tB?f~';-m@n#eְCGMLǗyAs$@߽m%C dI0h転thPV}:Mt C6`ѻ4&bg@2 (N"jrd彳j}VkWWt}=['-}0ȱ"omztXϧ!b A=Pbs]o/דOIëxcL]!\R&2ؙGdokoӌƎ:;IL<kaZ%*L.0YkY$]M.,#I+ZȨ4R,HxT"ѥuf+q11`1^}!F8#Uq:*C:R! ::5+h;n6p/LbHOqVu!2̶3Ġ0"1䝐Csf ɻ|k9 TF: ՞:'"vy]vvIνX=uO fx>O a<:'O~ ^S)9؇AX*_eC(a_q»p=s9S.źs=3 X f=$,ĸXH srݖߎfdèÇL=1^Y l>f>|[c)F%䞷%&dmv(ڴFm7PiGקwn >'ݏxnl\0jwOyOғz#6xyǼ͚ߪ]’zăga Fy\ qMr7{yΰ,wi!,E!;z?i{!OV6Mg)m>RoHVt((t g!MjeԘo۷ٜؼm;؁""? o>m}AKcj=!@Hm TZDLْ5md5TZcH k_@Ԧ|  ږݢխg2.(w<l6Sw}ufLmD猖]-o`a(y*]*Q*G5+%c@.B #buY"E)LkSCD;zq :%uZެV~Tz[|=$i3絗F,*ޛ,)ko J+k_PSS%6l.$)=8%"ie0(֑(m2,8O+꺼2r+P0eX4{'BFkN?FݢIUfDN2gve/]8X ,6ﯦL SE6A3U/lh6 &S=Syx_k>RG$㰥6EO}^^uS8ڛJDbM-mɻ5 m! 1WQaCt(1NOQuڒr>tx2 [Gtmy! 
P?[GW@yF#5[N(;Et'2#+0D _CױF댕ɊD#A,&"kɇlb2l9;TJ o_>_jrq;ÞUwxI"_a -D]~(@{ĩ_N??[AE.#00 P% jV:e!dBRI6ytڴm] d/>Ȑ5=2 %9Ik%+UI"4f<+qW5A-_;Y7w,ѳW 򅘥oo,{|݅fUQ-KJ3JkPJFUT÷v囨RARVR.>z KJ52i}mD62 0h@ FZFjY2$ 'ttE`֘ eB^4ْYӯyۘ$($YZYVeAڶL0X&fz2S!| aLtEʢ18"%LXg[ףj&Η*HQ嘔,bJF#y(X` z5j0 |u 2W5"qaQi&wA>`((2 *C1.h0B&@qj&zRH Ss5~r-c+}5q>~}{fp̽c ;/.OEa+cv]@t\]EɢL1vEeՁy%T˾u|9T,RQ_psKp]%r QY4S Db㳐)EzOF8Ҕ 1 QZ0!ѩPr]gmWmwno2wi|sIAr\ >}|G}9ݺ/~dOGΖ͓9}ݫ?*Zٗ?&we{vًZVlKR )m g/bv1tͨkqi1@S] 'hz`ӑG=C֥3JdjJSڳg!Cu #Y‏1ŕ|kqݔWU7ʞ|;LI\b3$ZZ 煠?Ngl(a<(.c^d75?q?i;w{Y/zɌ,Yf)KуWofwsnyL{+k4t~2Z]Mݴ|p^{6,Zݍo̯xybŋZ\=/wo&h|o?l{G4TD=:ɼ}W{BԫdoWqcjK{O 8[.:9a\9.j:#YG8>>gq6TC"Ēj."1٢3E$\PI)>q7v[kW&oQ_8?jwM||b3JGE(J h Ţ/,ySǚ~;'_2웏Y[S5-W36O;aPҙ |;c^y{foU; X_Do:tȆBI؁|=+ݷkkj,Go]R臚螉y $e m :U8DE>BsSfS%}6}OsNJʿ/U;KJb61z?Z!+1,REo\Nْ'Uu|v\{;n;+$/pg[2^@4Tvh\i$.dYŊ Lx гzz|Ж5IiP!EdKR)k*fN dqW]';{|/T\sZ/_SlYhE _Rb{z3-<҃k}34{~jZmC56 (g7J?BFR  mwҸ;wA5%?=tjpTt=c~N:3ϏppҺʯx4懝84Ӫa ,qqW }ygܜ۽b5kTW RiJ8|6륿2,4#}w8An=89ۄ8q7t<-x-o'W&KΖ`7?Yt.ObeWS.;qg7~squoQu?˛rpխ۰ 7q V0 kP ozDê=(53|LF`٪M:zbyxV ?`>R9XK/tylsmXeɱBd]#5g/<۔jRĤhrl|F1ٰ/R2&JM|g !1m5u(Hr-Sމ t&h1@EXF}&6Wg`..Թ/%͍K*-njQRI=[Cpjk8J]{d:m|}wֈ=}_{OC CPd˨j$˫ToƮެ$P(ѐlv$Hii]f!f ŷz7hWCd?e?: TQlL|3+guT[c!Ï毦+ " 2֣:̛?H= > T%j4YC 0SF}D HϮ@ %E]ZKtJi|}0d<=qί-{~ox^YJDɸ, :܈FUNeGͳ,VcX]Jzp;Zto:zpiD&84߻1e ʨ_x8.nͨzg ܇'w4o=55 Q8]wwˆ=azo9&e<:>=YnRpd09pu٥|Ǔюi-j$+GHQ j0S'YOg3Rn]&Ńގ9:] VQ7jU*=hs3괆i!u`/gv]gePAP,ߓԅ5Zu6.޿w?=?IO߽.)䴫D"$|14[ =l3oyy\#d[^1}1vw(,iaoo'4Wfb-k$ -^9~Vsl4y=Ts#U>BP7 78:6^ ߌ.J"]a6x_#ɽ-UH6J_E JF"\5a LD3e&$tBG ;} @sQ$PDhU75E1)hR`.8t^#r{:O+Q9#H$y;0uggi;®ܹJyi;_A8x)(qcY!^a&2솎J='4#<+G!$˫ ^T%=]tڰng]9#lTdɃDBE A,v*"v!A(H02Le1 J66[Wd{Jp<( bԘz) `gPf Y IR*h2R;}ה=Ҕ)` f%d)8?JD,6ǚ*1>{BFL"S\cRIՅV})H)ʻc 0L8rgWO9IZ^_@jH mx :G:vA- t/=u\<6 -Á|Bh_Ij̗g߆gq8ᜏGc $Z'ÄQ KԊ%t_:հwqNGqN(Pjk- 4:eu,"ReVGr$lDĦG)™ D@9p%E\lFXR :K64gKcs-dS&}gqiؿ2):[H񒚕sYyH +w_<{;mܫ ݥe (!)^ۚ86{6L +j3q6+*_ԉwf)[i4'x~/4fd\3aOrͰsݱ5W8˯o}T]>y*ܘ&˃3]LY_)Ima `WM`.2EFzZ$ F6.Hj+X o"/8,!)J64 Tkdl&lfXlf숅ʄ0Xo2_^UxE.'z/`_ǣW%YPBhS`-nj"R[*&>j5ic)bk +=Ug/:%UMm]u}-S]aNP`K"8;Mpn6;wOۣk`(JE(MD)֚R!z)b.Oٱ/xwg fں5rԳ{ Khq \~BG湙YW*,6: H Xz4T qs~H>)vUsɨd1'IP[2Ny% )؜&EkIwM\9Im4lIֳiE!1!RNg钁:-ٲu08[Cs(Es-<Cf화'gy1F[*WlqEQ0dg}J! 
W<:jyH d FxfYߵ5٪}b Kj43zBxETLEZeQd#S6(kxi,JEl]E)lm7)/*(OVr]'rZ&BogR.jHf}9QM.ΡR04CJx9a ^K a.H BIVt-U2@N餷L`<W4BB=e(_dpHl>[+`3=K(B*T6G;\϶9N~0Oƣ4[nO~_u ~k@{B4@fC==[vD'TC/Q/9"A,֋P*HWVN.8 "J/QOLWϑ1㰯6mE<a>iS0_xCM\{_.Y|[]K-$2 h~SE\tP߅7/ ̛?:ZSt~0HC59D)[({zrKF[S؎[⇫܍`V 7l}b1AAڕZZFpo0-G7xa]nd} yҕ v|G"E)]XZ &?qCwA(SG\%c^p%LO"!3ӏO`d哺_Ѹtn}bGԎj6{^m[J!/F꺇;¸&NG*88O@EDr4{?[& â{̔  b\0s#ie/ɱ%eB%ilMk,]ilw!IWd&Ar廕G$F|]}; h5[Fڨ`2\P*ldhm](SZ*۔U֧*Uqa{Q]*C S-BhwBCzӵ+ߺA(c ٤RY,iɠC$ZC{`ܜh8Zڠc澁\>|<ޭ-5+|BJMU6@'{ym&5!z-pO!nPd0vD4Ccv̠o茣kO1heN h^{["Z"M4DehMz݁6l#MBM":JJ1T`&rih*S=^bx6;[_:GBFD^;M6hHv~zJ"Td1'=:fDyKƜ 9X\> ʻ$IH/Z)٫uoωTRmu ;Qo^ܷJ0'(mGZdjڤ@8:BOk0u€"ڈ*`8lZOI=k^,>YЇF.\>E=uk,R`榤gEmtAڳRȮB@&Qav.5ˎ0H%GxH: rNg'`b |ZC:mZqLJ$[PyU벫ԕ6kYp M' ]S`1G:+̱`HJ\l=TU!Rcfr_̨l`WP&4$:\id, |Der HM$P **ӉPu 1!d2 Vj=-V(!Ȯ܁pYV ++5 e2a̷6Q9j ȄكHc7uơDy>UFݚ yt_V 5ȤuՅ:@vdm^ ,:?jPJ4U;+re6 S`U0Mv>1~ޭKӼ ȓd}E M0)5VxIA!%DB>h AJ./sgcm5(ڽ,C)9kD*ZuIBv 䁀:赈 )-f)D[vVh{Nw=/ (D tdΌ (NEQYIb&#Et4$?<(Bjw7qVUp*yXT**BȲ$P>flq)XNX)j^ 4n+=liVԀʬ$޲[6RSң*x/GT^I Z6H*tM>ߦ%L0AlmRjh\1ڹ;y6߭l4A( = `-yÖ Vc7;"(ڧHdviP'7 d!g0Eʟ&oQ^1-j`R'JQYTPcD&7cQGQ1abRuN` t@ QY:EUGтΨ19MAk.YiFO i/YTAlRԞK3e2 b2ٷV :Ok=DT赫>@MVkA֫ѫAiY'X©rlhjӨ/zBL iq#f0Q%{suStE8 8%-Cɵ+F(ЭvF< \T"N6fTS.\oH1ˡ옅jIr)I4M](IWKl$dip/(m*?!t+<!G n-~}חkqz}~es!l~6*|-̷.ys:톂0HC1EtOBz۟ji˸6vz,/= o|E3wQb0lx@hߟPNg@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; Nh3ȸ'a)N BO0zPNN+N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'uY"[8n\hyN ;Ȇx`b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v}N 0r9W8V?}'PN@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; NzpŭKeu愖R݂?vf~UOaA%K.ǸD~1ƥ sO޸Pi6.=gCoN/P7 iCWW٥tҕ ފ%Aҋ+K7R 9ꡧNW@`zt墈6.\Bu) m|X%cztqIt壑n9t* (SLWϐBBKbp^a@E|tE(*qK!4_*^pU_6R U]Fi ">.nK+j~(s+%bAtE] ]\oBW6ʧNW@`zt2Ϗiܵ}I'5ctMKOI}.qc>AW/Vףd9?*.^+~/x·kTQr_6Y`^4l`6ӋqxW߻wjQX`i ڳ g12+ c3eu999#ҁV@|\?_GqeVZ}(/z߿:[o/Ǡg 5a%A|tE_ۍL ;҅~7 j?֐_nQ{$߼yP}bZ膘jhkQnj՚Gv}GfhISDq\LFeF9x/;~c;Fn,v)o vbYLct0`Y'dww{oɖudEvdy(UYbF^HBጒEe-*:'9Q& Y6*K&GFekeǘ ' @tMKW &](娫!JRf+66]\tQr;j`tE2ES#ZRUC9j2i*DWD+y"JF] PWkբ)6pEWHI]WDESCԕV+nuE24DkACn\Yns ܬ=9B /[}Qr\]QWV=aHWlY>"\.sѾu+t F] PW\HۚhWc/zyVQ]^%P 9R(e` ), (!AJU\yFTN$7{609cmh]j[+#2 D˓Rqի칺"`ZV3JJpe+)9|Fed+(u5@]))vNB`-l6"ܾOjxy\v{7\'Z=Q¨!(Y #$+IM{7u]ejF]Z ltE6Hk"1ٌtY@+N<:G]+zΦLY:˞7v(iu?t4im/l]m^;ꐪʹIWle+<](u5@]%r@~G|\?*e㦓 o߱v&~ O|tuNpNMVAkd>sS3Z7qLlWzdž.ZQ&0u,LhrљaVe+v EWDt꺢dJjXN"`OtE6hJ]WH LaVtNltE=ّ$ A`JtL6"WtEZ+0j2FYP;;)lMM~8>D]YxS,2Wd$h5O]WD9FWԕ HW,u>3Ԏ%?N|zt{V=LÅ;:򞣫n)tԉ%AW0Ъk4 ltE}/Ft"JF] PW\iL'|B;M>OS4ܤ|lXS-h:Sjv -d7 6Lq%n ZgS2ZƸu.#]!1pEWHk+lu%f+ 2 VuEvu3Nd+ZQ2H:hI]WHZw|u5]&_s pEWDkeBJQW+ˌv9 #Q.]&DZnF] PWK%pE6 +bW+g)p߇ wuٍvu;芏:sdFB`M6"\hm.7Ku5]q 5@6"\sJRQWCԕPy7:KQCAFo;h9S%&J@N. ~Y&Z&ѱg]/m&O.fJ9h [p(#`k}^ z&GJ39)HW_DcpEWD''JG] PWJh`9銀-FW˹EWDҌu%e+*UkY4uE|\45D]HWLv4k."ZQ쨫Jݾ{|"`Ϣ)+RQWԕS DNcW3ZI"hJ]WDi娫FWbϪSqU}/dyRh_]Fؕ+1Ъet[0VuEJe"\ݔx@KtiִNcNn6 Lک.svbkZd6 ;(](0blLtj6-5 LFvv22.S22t;rpO 0HWY6"\>ZRQ 6jRi2Rag+uٌ#|]0+\] q \tER+rued+v`J$ 3G] QWqk_ڹZ4ɥ_Y |d}훓k<O.aur7we;'WAͥ?R[[̘[Ίc\YYvr><.O\co|6_ɿMOYh~AO%2{O@6Ռٯ=AjJGPVy VhYɣX+R.@)j(1;}N{<>tf$b3-LK]don%sI45M&Wf6/.|u*6ݺ'g/M_. ,E<[slqJ{ǻwWk_摫CСQNW5+JY:8ˀ^sw5江ɳoHon-?qybEh4iVڔw1;a҇*t]T/ܴQb XNNhe@L壣5 e5&T18RW+7I*/m^4[g%ԯbɢ,)Xz4"h-$d`}-Br]w{+"tŬ5saZcco},nZm^ mEt977o1{M]>mȯWxs~+:0cg? 
_T֋>rAN_n_ OA)p=~UR]~=7; ^9QE"ȲM-q4f\dəa,UqAW;Q: &WqM>TÔlo_zm"~b I}>OEV +a]@qgͳe 4/*wӻ7q/n_|OUu\0??,},O&M lqJ}ӧ~lMt^_r3e̜1[`+iqyVl/M>(f&XE}&, ԏq=ƸKؔfz@l`ff~:<鲠 GOn+em=|P ~]GH*JY2x)*h}QjUӮAw~/ qyd91#Slg,gptbo[B=(C<䋞t=dOKUPUAK'Ɔȍ d1FVqWA~H)B齰b.Y\de4ߪҹ B%T i}i#GJGrPH )\\BAUeQ:~ʗ$#bZO8n}q}EUUu[c;RUyVjU,8?p;cG͎/^訧krfN3UL}"Oej1Sů2UlH/I>Ǥ9I雦t hjћ0a{#o; 䆭"_=(}J*xeLc&wE t]V:Z uؙQvL4a6w>Ockg8硹qr5Lޟd=j=fDy:7ȍqwUno|~c?Nl'2ܻG)^^qw;|w_e]>lRX/=T|IF7M NHiOE =[O|.翗G朷x(Y" Te#,^s[qy=s/K~ ү~Zdu'/*VW^\KkgJSyK9i8'ԥUYZKLQ'/\ `m -!B*J.TATI,-#O^ gɋp؜vUkZZO> 그 WӰ{'>\hK.KxUbg4{? fW{W/GcYAx0(K(E Ȩ^ֲؗ cK;=hʋ< %jaUP%dAKW3V`@|5VT#ܜ.r9V~)-yNr\.Ϯo {k\ngB֧Kjkk qrVWeOeZs|d}( &ʹBcWBҖRKevT2Y^ s,%p**ULIC5&OLh_%1rLsLj0o2|541e1 (C5 D TMΣsĕ,eU);[̶t!6NTx ) O==bs=1*o\ʳy䇫&򫱷yqyZbj֘^'eu{#I+mqدꇁŝϛl$6~Z!#RrW=÷8$% %J,No22MϛS?7|ӊ?۾&")S[ M؝(Dg1: r>I$ۿ{ qI<ͦh{?Γ  >Um=TK(gpi57dQJ!C/o!]ͣ "`.E%Ϧ#3dS""!Ayuà*.Q33NC(w3]vZ3=&ZUo!oOb}v;#T1rz}7 <;|`.c{C5ݎds$h4_?źi8uLU+E\%Z^Q)];XȝM94KWmvkvw{Ql%uϭq:ɞNm9Uzx^O̙j>!r_]!k Et0 ڋ$%ZA/)r#^Xa:dE|JLH\1Fhf("q&2G bZAoۋlMJ@]Û "bހ2gXiߨ+ -ZĦ]L^~ީ&R9-B+#űg3g4V>-/57P3&[" 8I?]2{se9ZV?nP?8u""I2AI߈3d(;٘}TsLޝMhk7ZOHt6#(O5, w8o1}P,Y|s 1,*zc[E _Տ o>{3D/ 4|g̮GzfZ}ŷa,,iIƑliѓ嘃t8*#GdӨMJu\Cѵ]LbozhC_TJ\;/q9 ŷߟ_o?۟.(O߿O8s&4>ƹgu@υ{f]P&[ܑ{a>Lz~E%wR1pg! <@LvT< Axh751S%N&[ۢJ#IKuJ2pΙa&26S΁oq>Z@4fVSE|6M^QiϗS 8 3q~xl~I`6P큳Qc:tI!xB~ LieTP]&bwʙ$NoO\侇Z[kqrp$CAxhcOvN)"w 7n: e2,Cqh;ct%O*;- (/D`b 5 Vkx^x%WA>A` d2Eϒ99("\TFȥd;gaMSӉ$T–kяSKR V3#Lf:!#v-NkꍲVG!Hη-!46t4q:1Wu$P(,B "P NRS$%&EFʵ_crb` H.XJ;e9\:ޱ^@iN;֗@Pc&a1XG4bڶ/¦Odl?ʠ4a2Uiż/~Fbzo$Wv7=T> "˾ܳͳ#? Y&,|e  jDmFJ;HBe ՜yM :İ69X,ּY_'=qt)jңdԓ[VqC={: 漖sMU9q$Wgnf`>~抮iEȝLGq:9p7zjՀVw9yyV2v zQ{7z!:ƈ^=X>oqTYFN(̀yFޑ}gqUk0LU.ҬMqfgrdN -̍[h֭;6+'{\os;/vj}Y#A0@Z_ mn8OI֚鞗U\D$rL<ӄ ȩYl$ *٤v_93/YՄV)ZzsmZx߬;}K;IsG`ZٝM^U~vn*je}-sRxCN`tP*c'Sq% և:'S^M>N:F3m"* ѳcI@3pQA63e(OJ:L6CT܉X&H^eEt O8@Ox鏬])#sJYk$oi-Y_y*_s-s4I;W|*u@E.b@R+aƲJShԫn:q=ۜx7-V @fPȝ5gLM2'u*80t4[[HiQzIAb|TP^]m_S0L)xj KZ4FZي!a#A+<6`lWob4P#<"ͪ9/nZ=2v@r7Dk;6$&ل{+W{ 1( F Q|~ژA%[{{mIA=dy-)Mfj@$f=Z>xm"9i C[mͪF`4iIy]1\Ʀ卼E;/++PCJ#r 8-d pƩP0&dcW%US蓶Z }wE[w8QʁZi O?:[ǷAW.C]M y0iHP#BidR_\Ni;9+*Zeӹbʐ c,28ϲ3)ACIlUۉK8[%ҡNFcSx-`|FXS Mw.tg`^ȞsxgSI9Tĥf>9MD@D'O$T]WC*X@ J d1".YN+ʝ 8{tjw%aD2tzUɡRD&H.(ٌb}|O(=";8i޲ڪ*]QꚱuA@} dfUGVL΁2;ST:>E6 Add=אcaD_i &VM=ދvUaa78 Qu0c{|;Ew,Y'>\\Y7}/.?^?qĮi.r\0B)*%RثCll5tEl$b%{AaJ`o3!) L(&KUSg&ÈbNiDPPK˥B&Q6bc )n3j `pe4u>Jd Y"Cm$X` $p 1؞ʒT{@&x8& '~3"gDw^)kHʧDQ sR^EF}R:s0|k.֠y Ԇ3ILLqE,d`ra> ~mcMpsG~Dv?n#m7u?Tk`YO˫K4~^y~B)zSV˦Wo)m޻/bպ nq|՗_'}7z;NJVU%s}{E,l ~&m?}ggO6M?[8 Z"pJ!QnyǮM-OD6h#6lɏo׳ŋ f>͞F^`{k\\]L.ŗ5GH{nn7˛WׂWMg:r\_YǾXKN/E7ܨ]NfD,wnofzQr_WW7(~ oW.mF5*-_edW^d;'^ɿ6ۛq%UD>/x~Zch?@ Q%Zd lp2THklR? jGAwrOh7UƼ;͘ڬSg jy>nJ>W6а.?s={~F&p8qQS&DaIpGƘ$ap%&;L&U6)U>F5c dJEOzE&%c+5 U7L2(Z|tդ4zGWjVNV \S&U`WNb=vgmժ>k3Ͼ{uՒW;}חs[._o5g#Ѡ`C$PC Lso=rG蹓Spu/hy_a?7,~՟v#lVǷ>ld+!&)XL\!Fb )kNQ37_\?/noyggx}{r5`9\/ulu1VYn~X.RB,G/G&XL >xIh_!K%z*.j1~G;GeѝFo{]>MgBWq]aY¨bӗ.ݞW.>m^&yԾtґvKDaUfȆ`@& *U(e f{8'4h:9Qd!9NL%Ī[MaFhQ)8?. 
TDEyF93ό￁ĨMI!JJ,+)Pgeˡ X bI4,v~D)Y~ [XxD(fɬMՊ[T*rJd2 8jD.(-)LN핸e@}殫*%EyT κ7Omkv( Q2{piICAĝhBUQ6[AI[O8@Ox鏬])#sJYk$oi%,<e|LJ9j9OH!v b<Ćiktr%ĤJ`XQ) ]cUo^MeO78pφXC@wC#}ܴd[)TJtB͠T;kbq*e/N@)2Tlq)YLl7WE~AOCj|ӓ) TA%WQche+H1 fN\fgؤ؆iȡљmHXY5ǚa08Rwo6=dZRt1 F6ښ0sKJ'疔;cd Dbvz6TYt4̈́ڜ`5ҷښU VYy\QCЇ$`f-|)D,3)B, sM=.nV-MsU.HM ݆Pe,-nfk V*YrPE~CL*kJ5c%)GWNgU]6e촬ǐ}ASI)gg9gFT )ch "E6DedlƥMO;U~jY;mC[-#4_)T \u>`(Q@r5+]U< ]$Qۣ7Z̈gO;Nw@lg$ҁ9u<8ssځ?T +&]L,t6;A2$4S* 8pvB)#L%,xDE_A`4S4)Zn9lT'1W̦ v)9yWYI-QZ1blL>Pbh{pAKZTP>j]'Pca[qmD(] DmeK{:9d|h ~q'1q(>yf8WEqZ8-vǣm"Qɡ(J X"*$Qj\&+,%7l  `@Հ%`=՘5%P+&@`TEI #}`p0xX5..<.|4wqאrQ#/󲸚o]|u9_f#%` IYjTlaBvEiZBkqPm"ܜDZMm됼v*Ī12FW WU|XcF~o<=w6j#Y_YKل)I?('Ex Xo/$NF'KP ]29aԷ%qS'S21 юrj&|Y =˩t.o&c*_I䲪NT+D(c%`5CP|-J6&DQ'8:Dde Aj%OʗP|6<<^wn{wӏӛ-!FmGr> f4 |N"EAX|VPlMz dždrߖ2a6~_zl~z:PP[Bm*P'Γ2.UoQ7&O |tԞjUj%h7\\C6GFSĄ(.ӵp`+nO{"*JSj˲Qk~_w="-bon/y_s7gemZ삯bntތ7=n?A͛-]Pv9DmB؞_!#:cm3EL,޽Ŧ(ss԰IU=|(}V\pEZ, k6Qz韄yT=$ɸ\Sϫ|B|v|S꼾flgUp@ţ.kK\?<l˶YW;}GyTz/~ė_%ī6*S׸ȝ>r?eMͭ}v֏Amz)J={rYnzqN{NYΧٟpݛ`ŭvgʣ3?짰Z(gGwh?h:#\|ωW]Y\z3wڹ_󒑸p{H7}8i{H[%6Ηmpէ6<~m0[򪳊RPuh1iyS,*p\ MBĢ&kt-T+9qbO UX J@Z[sbumV2B:g/wz,P =?IFT;k1Wne//b 1E^\uWiʝ(6CffcD/S鼲CJ @׊6Fz'^'W'TQ؆8Ue[1d^[!Z ՛8"[RquL?o?^6Rۻ4X=2":_.W(BQxͶ6]Q$6yg &]8mrMVseP!ոB" bĠ#F[$MZd$`{RڋBA]WwsA ܤf?^g#diSI;ӢNqO%cwT[F|W>Z"ݬ j ާ HO5*Eec MmGOH* |-W~'W48nco^£y׃;:S|/8x-z֋) rbvP:k[@<<L){&dG9h>~Zqv>q:3uSgt J. :; &#u}ƸM9vlФ )WǙocO:F5R΁ >Y@9D\q*; SATðbD֭AkE<84BFB ˎXżeAP)*ꃑsϰѦ?ݴFɥ8D7ʶ ,Ďbvhj[^ˣ3;1:SeT1\ Cm&+O"D*5A%y_*1r.lk2Oc5F00H<"D$ۿu}< QCQX !QS L1–8{P, Dl寽G?ϿL/eRS5*-R!YpRV)"E!d6NQ2 Ұ!+ w*[UF+(_S@b0BTܚpp`nq=xw7nߦn򯼘e:W-*!sy|`HZvw?&iJ.?d܇AR*,`.h0 /;;t<ޢ>6 +B2JjV}0,|*vcW (9.[_Q Eeu5C<]4C` 1NWՏ=1ip|&?fwgJ{F_!e0,2NfL6H20t%$'q˖lK-;&>*SX`MzOBZbbiK_ٲff4YTyKև^O)!J^=9Wl;K[%huK.kuٻb(:JFP0sR磐cY˕JRsⷲNLǿ2:yqݳ_yŻLOG'^>/2X \j_GޢiT^4U&M&ߺ7Kڽ|80q X@gy;#/gĉWx+rR{K3zw+:pp^l$0xQP#@;C^}fj]ix:`ǃuQH&$V2hb8ߊ82hIHsܗ ЄpZY^܎Nt˓L P'aT<غΛnnVs3aoT'ׯs\f3NQӌ2gq-`Ivj ^*(Q4zѴ*b)7{^9547IcvPXdE&y["J!1ld~E"ǤN)! 9 qFf `[sOGs&֠ܠ\y54mjP9  ߂A yW H5w81X(% w&eiD)l)#4RAfe9itⰿc$~ IM^1Q \WL 5_ݑ9EËXJ Qya!DbcMpZqlZĆ~[ǰvz"d %l#$9;g߶&7݃ti_&M|B`RGSJmΌ6>IɚaL"r"Mz߫uk諧]O_@Qnw鳛mV!-=E8gq5ϰ99c;T)9CRFGL5"2P !Z;z-!4Q`vqז>u~P}мYpt8 y4*΢F,,9!`sW{Gg!xܚ ؔv(IzFz}t;v˭m𱓲II.ޯ,PB,>J< 5/'PkLI!dp"0o4Fmeyk5V4Bˋ쪭bh)ciÈXe28E)iR셡Aeو˲gm5Y)xiH}Twum iԏx~6g,?z%nq<#MmwE*ЩoLh`Ӯ|cxF}RI}ouixhN!XTYm,i 3ӫU)N尗%M0Kk?\[(0M߿/ "uapЉDp} r$}Q㷢d0&3NZBEΤ>G/hѮ`mK=VdfZ,d]=WTa,,_nPΔv(]S|i=Χ 4^m nY$ӫI#tPU=2(oxD*ŵk.V4T[x*n=wM[ټe4EU@}{oi4Hf ΰq|P!k4[JTξ=\&jƿ:ڱ@RIh "PtԬwՔ &%1~'X ▱(Nob/.$:ImVS Gv6蹤N̓l3_U^z.U@ty4H 5qn4Z 8'c`bҹVàsb#V{׽s)7HN4IM&bzͦ1\g 慁wz cSٛ05#,CCD I63#ׅP=sLr8 }!)۪dS KgN(z Fyl,^c86KbTmL:uDLauLPദ.PL=6g -N upıB")(]h`# pk& l`BU:_;Gbw .Pwg?Pg 0ExHPgIZAw{ Pig?Lh'ơ 9W9v94<0s(CqLϝ=4co %$"EIpz\ͼC6A0ɵ r B($0f4j|5#,RZ+Ioi<r(|Ord{֗~2,3 um;| Q$iJ$FaQd3.Pn -F0$`n;_U v(kRZ4o t(:äǎ;4 τYg`C!c[A.U"bK(F qo]o#W-xxhW xL6dd6 YYJSlj|NwaNpZϣAS8V'޺OZz9j8dѵo]0S0 ūσWt>U˲sSxdfd-d mVǁ7=(|T+pgchXDȢxeFJ s>,'AS . 
f+9ŴpR4 םAy|| #Vnr3OݣlW|荿 loE3'QgL(X8&1GXMKC6x\;ORy2 z]d f͉CiM$+sdang> !'ڋڄ [ֲ|&RZ随G1441 P嘸a!Bȱ,GZB\,i9z.v<*!fM)¡'&CiAfR@v!(4)4da (eBO>W љeF3Bm i-e1Nhe$F=UFlC 6e'#dxjC6'/coλek%S8wx ݤ_T_78O甿 G4+[¸QY i3.Cܯ$*cmJ`& }g,Sy0/v~ S19,؇A}쩮 ZCS94?z<Qm!-nOhNFwNEMğfQlBy8{٣u"Ei&u N#(w7Yv2\zO]5žu&ۑv ᠻɲ5~CmZ~<\;iM[fvv׼'k=q'MCx[ôg aԂ!]nMF~|?UF9$6c5~0i)I.c S)Zڀ7^8Ty7uvwSqhq + "He'M 8O?p ILم0ÙVtLZ/9^ j?vJ5H1{c"WDb&cPVܒ)P\2LpV4Ő]\-%6>_'$o/RUkޛWrtX >1ȌMuĤӴ\L֍WvWeOb>i>sgk1RQ?[nCٟ*#DUDΟ\(G,#HOܐHUf&yYZ_k?W~/p720z1S{JKΑNnR{[,m/4Ҵڑv0b0Nlfy#oT{ =~{۝l }$7i֣S=9 }:q rK~;?}P4V5o.(9;=|,sL%HG7ҬS ˴e⬸T!uJ0 xukŢ?TNgӴȵ>^?/ {vgl)x _f <ņZeDQ+ R=h8'3]vƕ' ]T7M9 xR*@{БZj預K=yf76fPti$cCL:8J B̂%޴fKsn։Wcڙ=:n:yqeFKa`hk$^ 癠`MZy)kZݏA)dVi[eL_r-z 2A Ȝ ZT> B K\Vf!,|5)VfCƋȨSɼ9bJYB,X,p :N~w^Nl6uȆD˷!w- v[5+(vɻ߶37 -'q&'t,2he\f2T;]9*BgB)7#Dv#->r#ϫ ۮ?BQ^oʶEZτGZ66IN{k+.3V*s阐REZpuOx zZ'`Gi{]G:֠ Jy !@SSr 2p C,#n=G&FGkr|skZ)SRÜsuv_4b~soRr?ƧO[_J7Lp )yQE2xSfSYu0 LIwrPV!0[pPv`hL)!EvƑyL<()dL`2ihC,C+K̆']-Ƀ1!x_$zղYsEt p <:lR%H2AaV"4:&voX,SVcoO3g0`+z6V:[IƤ&NdiH h–`KJKkN6\5<+Q3i\- g)I`H40K6@xrʁYgRZAr-7eJ82d>$}16IC0LIQ z:(T$5)MWA7zٴ % l m#꣌Ҟo'DJxn)F7a<<^& JCF)03pLhcY0DJil\d} 贠OжN-Ҁ;cy͢F<^CKtԦ2 ѝ IyJBSR)`(f^~Ek-_G҃X],Zuwj~~yp<=ozw~K9pNuou5J1gI·-ILļ=i[ ӓ<M76BwTyͪ^}{d-]Z"nzcT]^YuAzW;&BrgӛǞdm2r T.gʻ$}qΈ:ZKt1VO欰Is'Ϛh{l_𖉫IR{t>ߓ?i AO<>).gM{.H, rvz۸MլG7AzrG:a .϶KV᪯W^0_J1\ k|'w3y-cc1;f/(o\=ϖ0Lcw6fqQz6ufv;еZm͋yo]}~k| =\+Ft8O j4\t]}]眳_ΖVwC~i+1"7-')MaqNj5jW)vAt))9`Tū|>/to7'H=tA(TUY(w k_;m@iU?*3]q?A9ʵ8G=ϛIg"{i%ɍ[("kώN[<ҞN瓷xsx$I''}ӈ /q`~qd8&&݌E-E3EN-#)>wF⿙1o@nE[vsQb̪ݡ1y/c- K4!qx sf@hFii,\Ufu1DR%ʦ(ϰYIxr먢vAhf4Q-%a#0Pg438h,U2Tbg+3%ZY;4+hki\q_ *4>S 8]e~/9xQ=llCvxE_x bU.>}V!fؕo(r;Յ\l.T~<^>y}4`GBDQAbBF)@ 2>@貥(:(BT.\K^,gb JSSWt)rC(H[v/o[k~nju ї7OZ!hV L5AF S^JGL*s9Jn1`JH(ghx.٭Q#22DIZ$A [i1c.rHr8,Fhϔ&egu$RHqb(g\Yt Cd@vudp5AP8a:A Sn4"RQ(%EPӟX>&DJ4h?(rȇ2>ECMbVXNG?iV;CRD@@-Q+r|CBC& Gr69H' 9t>gR!ÌN%K2R[hXT-W cԋ0ذE )8@DD2F.QihRn(c9|h}epoc2Á'9ÁgS}]llnTj-jQ',Zijḩ}tN3Ȗc<s9e8݂cT HD9#m8(u\q'0i) +b zAy`ՉM]rgW9p9;Be3F ϩ:8yU*503%N0BX8i6ڤ({pKbr**8XB H6LB%B`@M n&&b쐥*M=%Op8*d߂Nv 9?M6AIW9v}n/F狰.6v@ٻd;yEuFE-;l2)hTdWUqdQH)(Bs=,&W:JDl };blb‚*H$hS$CAb1dDт| Jty{ZgT.9L nUxU9tzIRI*\T!E& 3A({$S ڕ>Rx8}A 9R?EDZ鈈#"Z~T0 h07rPG&.Z?$MpѲ6*SA NmRKg<&i*C,َ^.G\\,%"/lo J" B9W VGkF#*DF4JS =x8R1<=@ؖmOFGTg?HFd=7f?~\R>F!`!LLQHV7 TڱQ(DiL&2R^*SW_!\im4U^v֜'ԡ?l7\T D#tq=^sԔs] u #ut=E0duwsma3Ly0؜}8،2͙JGl.|E@ OnXxVu~9OIE%RNP䠦,RڨR|>F^Y /阖rBM$ 2cW7Sa>|\4s];_z7Y~f]xKJG8zR%Ȣꂭ}覟i7M*]۷}tHi?NFYݱ ׭B5ϟuOfES*$ϊXHEVhQgw2α̽h*yBeRV !|QC.gYNF0$8'EȣR"OeZ$ qj|8*=.JYŜ@h8P 8-XO^/^8Q A5 (,AG}I(Ks*:())4#d` iF1xM3>^p EqtFŝ~QV 4^ ŵ DmQ `\핐HМ E}u/q_vEm eH x˭m"0D2Fr 82/OR*=;Ǎ4t%u4qZ0t1ST#_VF)r[JWi@n SdϠ^U>aBs=%f%wݝGi8KSh]clmUu#t87b68K]hmG&iWzUJ DQESIbk/9ќy fb%m6[PInA4ʫcꏍbcOogރ>giJhi .Ct2碑\[g)pomcwsKyŽiHo&njU˒(%q|;Kzئ$۔-' ofgg|sWL#,8ZI}#DkKM5/ 6?b{}{۝lK)oCj@ه0̦.|(]4,[&}puHӫͿчa')\q@ƥf;>t 46=KPWf(֬rLHvc}~'t=_mWiRӶv{"$CR肜?Qnxmi@GXsʭ5'b~ pS\KqM(59Egq1T$U66$X%R<& K 8vLGC*e_4‭@é SĖPB :1#4'(xAcg"go`%)=L/ǵkym\ ] &MNjKu/|rkf"\]0]1^}"*X9hk{.GvJuӸmַlz^Ǩ:!.=bToXp4Ġ B,8B RI%VqIh5Z`(3`qgт5#*` s'(X>:vπ'46U\\{ED?x]YOdY*BU Il2$rOg,HTe,3(-1R{TMoeVz˵ιg8gJDsa-b_/G]wcq({ҲGe`epJ^CVjf "*LE5 Jw/I|3+rP0Q)a<_ hު5uB&8~x+kqʼn_#><'ٛYiF&F>(mRΊQ1qJvr:>JFl}ʉ 3+"uΌ"BViBU<4 ChJJ烨9h^vTl>-=-![oLTQ铩8m=/;{Wއv 'i~|Z~Q>KH2TD|:+dP@ k|ʸ 0//'3s :9 ;m+\>lShmMr ՅꂷTw;hCSj{aEG6fa`Zwvy9Sz^ͧq=$ss;JgeTu=ox8]nU9:-K~M]r,<Ɓm:Dh.X!a%Ӡ3@l|:b\.˜yr%(S*$7҈?x.;0JyB( HsҠ4g*JjL{윦"v>s ~/ن:]s2֕խ<;ڀ=E:װGfPQRP q@u$pu21,Rg2D'Hkƀ>$% XZY[ðqm[k0+9sT0sLgG|hF}=Z`]Ϛuhm9xnnvKg>8F|_|O ފʹRH yᝥ'K+a |*S?<5Vʮ'/Yݛuݪ!(:Ɲ##iVlke-e` 0GM,GX5u[m%3_oLb#6΄y"Frcm~Km΀BczŸEfaO;8cb1!5(=\w)Sx8?y(i$j歈xZ|.nzϤ8}J$j !"͔J"0o4Fmey$^$BZ` 
ciÈXe28&IE xJzI%s|X3 ywYYꮡSbfҁ-p [(e3ȈK+R8QvE Et3L}'zEO'y>;[HSN"KD ^„!c^aARM[EluvS"VKЉB3D$PPZ#RӭkEd4+ƃ5_:;o`uh6CCۈ(j۲Iv'%ԆO }2s&.zK-ܴbxzW)A~1z^4ӫl 6{so&oxb[O ~0ՂR~ Bqd?&.,Nӯ.1,{\N寣X1W(ꟋNj| Լ53wڥE}Z~6 X4F^{,i!f^wf㕯j^W6XOtkeGX)L"~d<6Jdp$}%l K h~lj9e`J+ܟ2o(d7UY 1'+{pٻ%*]]j_zzնaҠSԐzYa4軫;Iql@}V G3Т!q ,mg:,Mv>oGD@Yu}0EMM `δ p,Ai6|scJt`6kL0|3Zn5"Ř1?0k7.*oFO6EZ~KP׍qSSſRel[/j+oʧ߾:&xgjλyOZ/U*A{&iN40A;ev!S3hwcyIE2N\=k]͊ цbxlZ! )q|P!k4[JTsͥ/, |~QH-mŵN[L&{(hL*(zJ%K.lP#-b@i,M4kEY^:(M[[ks|Ս3O߿B ^VrcuRKRwN_kxF`"ODϥ6PK8ۉJ*_`m$dO>L? ~p;&T33t9W#ˊ eJx{e-kvߙߚjH#r/;΄Q "H=d44#XQًuxKM)aF28K)µL' 8zq?;f̮8GhlZ(~ o10!4#1~HMÐa4ifu )~8*Ө TOZ|4\],_Nm~ȦQ+Fb"CK]Q̒8=?T X:˙a9髗*|x՛SL^h Lk"Aۃ_/Ьazh.C6z> dN+4fnpK$y/\ dC ڴ/ԙ`Jz3_2?|R k*dݥ/E"'iXkD>O^#tn xn:1XN6|H`R i# Dk FKaHlE^^r_pFj')!9G+n^L Aj/\0lH68_`j?Y-9-_A_AGe jUGNNYyde3o9٫tbǜHDNbv=,C25exZ g2-vlrOs8}酙xY͝S”BnS,%`\:$,o"ض%[^*Pri Fm8I'Jﲐxt9%ݟ@ =ʶՋB R ^TCqf*:MPR5HjDƂUMrʒqWRcjgfqfMȒRBLE42=):G&ɽ6.#Lq;/Ʋ6vW_Ξh;,AtiYK%BM)arS%]PпYߨ.6 >j|<;)Ⱦ^I8gX]'< 0dʳ.EczvY{k}Bt~3|-e`"t:e4Ք@# KS;[6莼vi`oiy@/w%:i0xث~佃.379螽X>=slj{Qe?NS,qR&_ŲŲy)ҧtÜ_rcfHL$9NYOJ Db^GFϝt3|%|nI>c=8;2⨃=L H[T%)Qm4tRR.۷ N (Q܊F1$ON#P@X+p*toWظpȷQFh/15Ė@v.04s TH"']e+lsp`hJm9w7B2iirXV[C)>궸1 qOʱ$g)47?g#)#r.4j`rX {dͶ@ڋJJ-\b<*@f3q]!fC(] m2ƣqخ{HROq f*fvY E0I㚪2+'[͐vbzL 5 g߲s;0.&0 Z [sױn9ԘCٹH($vh5j?8bMBfZh?JY(%>/QT`5"9{S 4^M'^I1&f6oΤ+8lf)s,-4T RE_ZhqT拈lõ>MEv54]&`XJh:@},rbB,;`)XQb2rݙm/biȮ֑gN/+w {dZ=2y 6VDb\&$,&ϔlFܧiq]wZw?SJy wKqX2[zÓߟdy))]j Qc撀= =PD s7Z]*]uG4ylG[MG+njZ3'ޒ7-!݀_)b){d)%q2AN5Poi}rc(yA w3x v\ITO]h:"@-WHE,%~O)w}m`FLIh aW":n>Rm@5Ɣ5@=EfZllll{޼wwp3SLJOG򻋿&ы8Mpp6E(9{q׉TT &-7&ιSDڻ8gأ8g)(uqcsT8" RT FBJoRy35i-ɥ|nM y,ZԼjF{E2J%%%ټc {?Ҽ5e R4!p(8K]4㬆j?3f=zB:T_L|,A KufCM-T~=]N7Zjܟ\d\9Wvuhj4h|7<_ۘ=b/^'|rv*i/E{0-b ZR @YU{8d X՚S|oLUGD b@Ҏ zZM5֞ 4MpVyH88[Uf4cG,D73  $c;46/j!?:<;KJ,ٸ[ ZSF1X^#O =D4+bcnF߆' Ky!+w\(7ʱ]4w yɱp19\ڱ'nA{ܠtC1U BR2,)j-t[9F1[Jbt)x2٘'+66jF#Q )j9Սe(0\+"/ /p3KI>UG3tU؉fݿk=*~\ȭ1A^nHHXe=nd?>ScΜ jf'ӒmKRZ i#{K!%ǥ%lHXwte?8c-'\GUGn{hu+9|pr6n_n,ٓAG(_o`T;}OY+Gm(]mٰ/{懃?~/'wU. 4V0f 牲)9͓tP!E@k|߼B!sӊh7Š+\L-%S_NO/U[.#Hm0y{O:7`P@N>_{qCH/ rxK Cڼ$ ox\H/hB{1p5昴)\ )i<&*1W&n/peR,pVL#Uhcu~hxf}χGGcĆD܉]ʦ[7AgGn_yS?]܇mpr4<89F ߍchW  T(ˆO->9!dL)poIo:{ZW}vuVsv9M/Wꮓ5`ܱq~ZمɵO׻ӲSq vӚH)x܍?<5s|`1/ao9=C&@*Ro>&T ^X2EtzLcTFmU=u?E{#3#%3hZh@]]sGv+(d]U~Hl=a*/ꯑ. JIAAP~ӧ^E4$w򷶸8K}d|HE*o/}q҃cFgdɺ5?}P97͵$ͪ"h7Nf{k%xN4dh^ z@r}9- 4J5ST:8٧^)i0f;Dp+ ]WH! Q S(|1d:dy;aJ$|S(_ w6XS " *ؠdxiЮK=mV(5 I&jeW+e$mֲ±&6-tt%8HQ`w#)crX|.H1?&`Ko!\"U0Vj/@fT `6+(\]8&X( ꛐI:Ci*dL'B< TB2Y Vj4qnB AvEm%˲fH57]\?0\ͷ6Q1j ȄكHcoM|%!^eԭ9+ƒ] syt_V 5ȤuՅ:@ 7tXTu~T% 9ՠhXżMC5YK1Z1ة`U0~Hv͕wAjy1j"!%:]%@_>cLG>.uԘ jYf}Jh v% Aڱ", "+P(vׁ5$BjF,]׷tƢ Aa8J"Z!k,BJ#\,*Fa!dً$P>flqR(jS,Ԛ}-6OtQ{H$Q5$YI$eoQڀJMU{9_ 2QIX@iT_}ŷb2g* z0dkR@z㠝xnknxL{u:?o2IVa} n=tU:9 6IҢGJ$owK($JHu6tk*KBq$RV<4z&fcҎ,GÒS*3bT2j7LJȀ-Cۚb樇 tyB1C[uŹEYf1uA;(XfhTYڂDe{!kQƮXBv+Bq&SurSpͭ- wH._* Q 2=PʢB6:XL&]~5(:W (]Iڈ lhAgМۦwZ,zFO -jQTAlR>Cɗ Ug&ddH6#TKt@:'/y~5^h5у7[A[_%pg^ kJ^V@P8U8P(-l4EP+Bbx~)AH 8QZ{*(=’#iVA/;^#@.* ׃6FTS.\{3,H1ˡ옅jIr)x* i=uc't\- 5uhD;S`]pB{&[h#O^Zn^+[x݇U9ڥ1Vد 4n]٘7vEt(,dN"gzEʢ9B_wM>R y{4? 
q#Ӌ ĸ p=_\qrѯhֹ3).wtLkQ]F6BȎ1`Q}3d4}g]9v \FeNlV$(?]GQm3Z.zkJE:KgcZܲAkWՐ{3+Zvdy`~X5Zv,F)3goo?|;/o3NN':{L(;ۊ&T4Yx"v2 ~+9Ti+Riy'DWt4)BW@/3][\&Pyw_--+ t&tk/ZSt-]5v9D)[(x.B5VDuZ̄8fREdu-dzB2 רBܾB  : l+T hs=LWHWd֒SZ!a:tM0S+BLWHW6Jk܄ "2} ZC\Pl+a:r83%u^uyEh?ʠxGG/ ]0v";]Je^ q2tEp3;І_j'׮^ ]a,7z-0=3yjKyՖhۡb6@j9]-t»z=ۮtE ]z*tEhw"^2]"]W욗Pwaϳ9>dC+GG^K?]^:g M͝ӽLNIՒD "~>?޷G7?݌[Nr:Z\ůW@.g/f|Q 9Inoh袌k_\yӏ'3áV-?oWҋvNp;&KC߇.|cXbxТsj;]PЄ<}C84zED/z?KtKğhL3W^-'{7pЏ?ίQqY>e^FAx ෽.o{_7}޶쇿(KsJ5z`ANyxK|}Cv7Vj;Mp'!r#]8R)H]CL^{+6![7&bFI\+tg&tkQZsʀmrS9tP ]n+l6he6{j{޶+dl}s]E=M/x@vea z/![/_~r>_t%Kyϝ4G}GeMNߩ+uH ^ܶ>aԆ4\Ff)mzHvP"-7owJ['cNJoG7K"DOCm˟W?cp-{Uᦷg6~v䕍[/CW7)z4:Gq!<r&1rEwrld9kE-^_ܵug_SoV*zjiΌbmfWPΏM>k Z9?v:C꾉2^6]6zlt3M!'xUgqG1z;^߶1#7h@<~678jvgGO&d&o*dߩw w_GFWM:PjB{l$1Ne>"qB=_Y:5fjϙ_}.˹˱[ kE|:k9d6dJH2qE 0ȏ:9He"LOl<۠ NޢZZGmب0J7Jwʕӹ;Jh]Uv2tEpdwWv+B ҕUk] b2t^B;]e)'Ui?{gƑB%w7r?Tص}E|oA@IYIԊgq4EgZ.M#ῧ]SF$z\Z?TzU>ʛQR pr q*1\W$M  Z :f+RiZg\WbZH> v>X.\ZU]\鎦Gpp\% j*M'ij@Ki*sttSMhpE2bpr+V{}*s +c-8\.7֋j2"t]Tp&8ﻵDUҸJb|E~4Tjbm,2@4kJǁAIC)F9$QX)q!T:Kø g;bprJslgPp\A-W,8X1"4ђ+Vk|bW=ͬ]`/g2rC+R+ bએr΋{+\ZcE*# \`k\\:v{\Jmn^pWQ  (W,7)"Fq*m,!Wh$%2 uJ̛AVk_bX \7Gʣq$C*M;$]4EW&Wꩦ> ֮ubpry7MmW\WƢjVX%1]$5̴+Ϙ[vA+l ʼLe.5DHr0J,Ó0n@_8Xi.jM,l.В`fr$b3c>5.I-grҖ\gr6 $ FA O,ŬۦRY"NقZG+Zi#Wֹq*+!0VFX bpr+>Xg+Vi +=hIA]\J XUbUqJH6r֮X Rpju&oJ zP[IA=iψ+k\ڠr\\َGJaJ*I0¡:%=M?q i*[PW6WꩦZւpł]+\stEWW{ \`+(bH UPpG\qa IَN1M'YדJpH0!湽FWFku͑P%mu4(iÂc7\LsXq ҖqI4ΈV Xm4T5) 8A=4?7MsRpjCv\W+ \`}ir,"qE**+o`{I W,7(";%MqW=U\CHpPr2XRpj=+RU/qAMѡ\z}F :*MvWir'Ij:n4: pWO5F$2$ V/0d+R+Ri,\Wdwך[e^:fq7튭K)΋!Bi1FLuKWS'ɲ>ˊ“&Q| 0PpB;hڍ0 ɑ`QLZ139VL39VLp&gFmqE@KڠsDUpG\8Ab!Y%ڂ vZ  r^\\] =X+#GVH0z'W,7:)"nUbB#>J$\ܠԶ_8e+ViJgqVR' #g2Hrj)".\J(>~ \`AkW$B;hXSW/WxJGuUx7)*N\aSM#FpŽ(1bj Uj/-lhN)bbuu@n/ѕ叭;{d.toUjH &n`KXmV4q( W$xF=XuRpjW2\WJH0 bpEjQe+Vٚ\Whh$EW,8\\F X Ub++υ+k\ZTUzSpC\흏pE=F1b>J OYW=U4ƉZj'~s墘76f]ʠKtrp:ѳhI}@x*Im8Yi*s p jzN W a1bF Xq*cUqeU AHpXT\ WZ™דJjv1RC1Xv)[1׵"⽂%Y9be1UDZ/ήiS6nnVTX3jEQZq K,fDLVnfszztq}U=qs{﾿kY6pj̻ yEs7 ngt~C7cؐjq~9'~zH7PtW뇛 i[`ZT-_GTGA1F=2?~q<+m̭ڰ9#TK0U]8j:]u5U}u:9cd4KukWtu'<֘;dl517VDHr^ٱ\4RXeQJC;&eD]~izU栉iK}%hM"#CM FQ9>r7lAA gv!&\|5?)"*q~TWyt+Nu{qˆxT&LyS=[FcRNCD~:gv򏏟6UM.y1.EhwnA0!ե--q:lCUOSIPFpy:/#j_.y9#j##w@l#s|h[Dփ4±Ōp$7(%ecF>±Jki#\L0`$ЬޥxrG4=Ψ=hQþ9#f_ۨ4!X/Ih$jUAGgDH-ɧ“B'NA9,׋v@=tb:Щ.P&ܼͅFn\n]%6z"c` yY9Қh5nM7[IMPU!VM~;3;y `$4 zxRF-$o v.իTW~8a}ɥZtwK ][ Ƹn ~͚TktqH¥Y%g͒4KVX֒pE_N(In8tMT":rBW{UKUĠ+ŜCj0,9O/Wr:ߓ"(uiŕ< z ,pp jzsJH04Ku+V ઇ2bk"i5IMf9&  _Y֤gɭ!čG$E$MsH'&n`N4F{@*.Ӝ> [QHu(W,7)"Bb z++,"\DW_Df z+ DIA>:w\h®l+Vઇr+Lgi)bsWqE+l\\Rpj1+VTUqMI0Jd&URઇ !lzsq/xv8Xj{=]6)&JLn9'jݫ?ߝ\adzU׳[4 qJEmRΦ5}E8~ Rzy;'M>zx؟G|0ܚv|NdO~ZSy2nήɜO+g'ɯ5l%ݷ&;}VPQmِ'׹_Ovkmva&N^TK8!s#~7¿ONgܝ2kn M{?-ax{`5k"]fWg}L'D+`jd5V0Gj60̐FП޲.ߐaf?ͫ |4]k~/n%=2{6bqqlݻO&l*܄1,hn=}[vˉ2|6^hpS"5ޟir]˜CL]٬,ѻ` P^wd<ݥJl?L/_,[OjYo I>섭S 3vn9ozf>ð Lމx72Akƿv09\@{ZP?Wtu$:1)mlsFްc3Sv\(k&ʨG ue۸7M9/gf7WU1dv8Bf*VcjfЌk0c? 
۸onpp.*Db>P]_]Ͷ`t9hd;[&UU4 δjev(Cv W!څ%۶bo{, IO_􄫨0bsXN8:+N/.ӫUpOnZ_еToݻx'B3Ϋ vxiN2,~"jUR>y;'j3Us#j䮚O VS3gFՓhZjORE81kYI6Zrq&Ks/P$N'HNlzju`'!m-MwZPjhi`hV`Jeu )U.BxPwJփ\eʢkbGϊ|eh 6v;Ƃ9Ҫ f}eC::rQƐ7w矦wյMn64 :ƨCi 6Nu[;lDmԇaC~6}pA?돃7WG_H#/us5`𗯾^m#]HI4kѓG*M1Vjhĝi꠾0\s/oj+]W<0"nᱹ/5̀ykyڋTx/g*eQ YJ@{~UfXVޚ$XMLvӥZ [^f8]!M3ǐuh֘ ڹI rG3#LL9!&G3Iĩ"gYIVPbV18X|bVMxSbּcV萰\ۑΚ0i8ttMc A@.q>(eΙUb̊ Yb4w!m¿mX-ޫ][==>._>v,Ru.?OAߜM.z@Nb5dj_ WS%m[.t sܩajh5LmO#oJ,"W>:?fgV 8cQ#_8jG}Dm076Sj>ŷGwK:oW]e:ۗAo{ζkïkӺ{?q<gIĬ?qͦZcK][o#7+嘷%9{6I/Šc;L?nnɒMV dd>օUŪb,V~x,Ă%6$CCvˡ\V*Q,bZ#`cCv˦9NQx; .NF Ib`DYN"+Z|F:qY(FղٔϜ4ʲAոg˶7Mņ쳺Rkw{(=7.k>̈гqb20=)`ق .jp>+XʤGjMsa3ի#ψҷsYɣLW(Ѹ3N0Ɯ1F#.fIAk9P?'TZqe3gNj "Y-4x>èܷ}VQ$l!?.0?g\F6Mϑ@Ԙ.2qe2c6=pX8uWC˳>BYRS,c<ga]ͮVFrU^C[NsC|އt-|ЉSA 1$fH)Ei˲^7>(2@\k ,WYmC7|pOxoUî%,H(SN(Q aLhqAVbNTASHJ`bAifRf(y^\%Qd_63)fgcwE(>ƨ6['ͯ\wVo_. wWf6yϊ5PT8+)ż%l,ϔHrPQi<Tw#<&$jBe~ƅx3fISZ3K\D坘X7𼓋@mnb)19Ru%'0#l[ZlrY:IbXug$(|4QL RLkz8pIXS!E3oȬʵ4/jFE68[sZWIޠ]86IF;a84d{L0>N!&=;ƺMel!f;Z֑IG3X66Ę̌u(T7&뱊?J߲&ɫ["6Sfv-o!9hղGx<rSƘSrh΀ `LO: `"@IzW!m/ZHG2Qa ύ<"Mse,3i$ Sʢ(HyqqfJSQd*KyOc;XՒWN,jjNxeIH1&VW0:OOFO:Ѿx"3Ӌm,84۞LS&X|MD6hbx.ekX\5p߲=mHx4^eU,(1N IEN+X\/"1Q7Ӑ d]huHe\ED=h=j?Fkb,j.LJZ&Xj9SAqNP~_!$bxDt3lʾǪ3nL 'd%XbIFP˙  jل l,jNqYӦ? KbٻRc]<۩g5OGcyIzn~LL$`KSS˙ 5PdL: lѣLhnxp&_-;\R( '.1Jҗu1Q@+$@7DW[6S ;Z67x}VˆX* jгkhcS-[k\h(Eb%pIda3ʉ˜ib ,;is`i nVvY"[r}&&fWmUџ`Ȍh<6=9쐻JP>#9f=^.2xvFh2.4nmH,)̎9(u~BO0#KdK/T,GYᅀvmx&^|fiO.:P5}hZ*c8/'kZ]L M3 yQ;R$Juh w/5B=6xdf_?)LBF@v8PT#O놻!)J)S}Y,Vk5 OhfL|\vo [[zY?}7j군܏#'RTezelْ|H&z fUGƠo,,¨PyXJ kv}_i4 2ϟ(.CnMtiuMF@Y/ }xZ{ٖY`;5dJ)WkD|du&x"vt&SMȘW9XH HޭiOwҙeACrx/^*ڑN 車ߌu3F r㢧ώC5?C$H<+yV/~8̢.xx aޱ϶JnbȥC?@G*\s/7Gn;E$.x!̓]y(O{ 3.z#$/%Jٗu/I{HΐtwRv"àHe˯<&@SH@ = @@taϑХ伬BYy7< ewg{uqG>MI2n-2K }y1Bb,xВO4Ӫ];X.|uc:Y̗G-2J.VbR5?9 =,{-=5eQiEw;JF : !XN=_Ncar<4S43M2d|G1( n^#юrvһ!X `8q4>$cQxj"LHsI\oA yG88T:\AN|^<ИBCPO8n+e!O4DTE¯<tx] AF|Ygן} QFœl  4@ǡw=rYፆ|7-ۏP[Ytn `XTOD\zwu8"I6TʞQSi,/R)!' A} ѓZ g{]:l0xndݥCҘ-ztH]fNaJӠ:A (&N3Z$K{yef(Wcoy/ OV\mW;( Zns(eVh1S}ғRbPC?Q G"eo30PjB` ( C2UCy>k5/19//=5wyyu]𢚙ĺ !ȶGZOSzut\1PAm}Qb3AT3&,!\iU^VE"'!;/vΖd/^$S nU̦QLAwGl~? 
t`W6*C^hSèK<򹐡4`G:%[L]$9; ![XM mB)?[f\Kǒz e7E&C(GA'bACsw-#𨁥;/}:8-db]Ȕ[Hu`pçT-JFuaNIBл:[>+Ok6Nvi,etxPMt$A}_Ӱ)םɦ{A{A8~vΖ^ _\ǃaHuvHGÔuuB=>!7rRIHA>&#MN34ˈHC\+7^=y/}Ijs"y2տaX=`U!F؀MO|B X'~sahU{ۥԛγJFݪywEmeWF>d1]`YIC/Ev[W*}P wYc|}7ή>Ln珷~7ߺI)Q"qQfT ] mmm{~_û6O ;3DP B)@(%TSXGl=sut+10CI"^0ל&HAE39:i-ټy}"rIc-?B2 1ƛ_-_b|Fd}yNqB1UuxZJ)$qA%S"əD>c[|y $tA6&+MxQ@u"Q\ JO;[.o>N8xgp ]nOū4!'<ܙ=|ɻ^ԬGm{re垇.Ճf@Ft뾻Ym=uEbJõasu1c%y8E&x"5S6#n4cgUyLky83׿ǙVGN'MIBR/`[(~v-^t(wPd{-~zЉDAQDvUD#(f'|ח/]wm\}(zTz63̯?$-ٔIJ{O2"a" ] i[)UͻCx<6$ mZ̅3riֻdڒ|5@&DW$m]R]sZ]Tv7kkJwh cϑRC`fq*jB 9!!$;TX;4Փ_G[3TXŜjd ?TPCwXnvKXF; jH/qթ'1-ziASF&ɿN3wV4JF,k#ܱpo8;#3Hբ ޠ1l6!ºxfAL O(LVJ6 ΓxԸm6:kKkkڊݺlH%){=c;qW\8̐嚘hIښzꧠm tv`ptE1J34lei+3G\ߘFJ,4NO_CT0&+}:ͳWXǹa{Ds_ѵ[l*D-Pb \;l1P晧/iG/-sn,Y}zM^]M-tȹ͢emίPɅW|v1?gZ1Jw;wZۭ+{ӂuW)N w/6?u]^{Id6JڿSYq٠kV hu$V ~vXJ^ݱKN:(6WN˧XtiٖîU^6EeVkq-ꊅ0 '&KS!qSۓy3]v52PXBgnePK8lGpJ<[Ep!plmi:cC^_Pu_=ʗXlgd}8Wټd;L9c EJ aйz_.WtonۿPΧ{/&P"UɌ>OX]%Qr z:Ju ßU<f|( R6rNҴO e5>R!<1C9w`07rca)D[ba= m=P[J``/g 06Se2] wN|R(Vl ;`LBAԨ| b>9a 8`х,lU)p:C qؘdDCCBE{ |PJ+)D"V9;H;dӅ9AٛPCyt?Jaj]D6G4K;}(F33O碝8쀖Y)cZjuq=z]E8~R*[A XfЪÄgy˥7V \.sب%[:vQqbe+ܥD{كG<ޖ hعt04W=+u'2$f>uͳ<=N" ѕ7~۶4]Ƽ~"k4лvʧ)7J2T4,efѸAD?,,ھGb,~Mb$p"N4A"".|ܸuBNl.ɔ\G}xm߶՞0 4(%@k+8A8Bh9I4q<޲jJdfU3tH$Ǹ(eyT<Ѧ:eD rAbzS_0I $2nqk-0̲ unKM/x P ulQl$Qgia_0rw&NO,Ҳ컓βSL}|('߃N>BqVWE[iZYQ!̸,wG/WXVEk-P d68YM~QESWyh)1Q-pQ#%C<q&Q,sM )6>Uֳe*,.+ب*dϳ xx@ ?k7A+pPtz wY ȸX%,o-TcdU:!+{y1/"K/K`a{A!V^2o[r2m)p5(BVYAR]Si`BR =yfH.bs5E1774Q )q) !'ZJ8cX :ƨh)ೕgޠ;&PW#"u 6U ULFƽ$1m|P~Pq#<.y]OM3%DNn$6_{,|>Vd,c?̈́([*k0 h2QF8ŇĈGPٍse.ETٰ;,ӯ0Sxha;1s(w> LY/+TU|AR+-K,ӈC)@G 3^aB;vp6SŒ:2s!yEæj~-ДDEZUP >k$jd <;|c/*UBDV+AVFfJi2mRdUJqAQ*(^Ǒ"q&TWOhbY *Oz?~AMؿ6a؊џkϧQS4)R^sT|Av2Ӥx~-d#y"Xj-;䐵J(UJRؗr;xs0d^ 3ImY&5a52iG1Ѷ'X+oXn)ܗoS|Y`Mq2q/sñ6\n!~ cT Wzn=UJ*pTTHsS2Yx-"ޘF0ִu l^ۡ08&v@dK-t&^S%_#> {1,\bW< {ɓF϶$h" {§1݅_ɚ6 1_e;R-ͰhB:vJjfԄdJh`%o<7aoUTP# %i統AX\&*aa'j'V&TǵL\0.Y})1"rzYOE~?"u%b]^. 5Q Kk1]?;VY? g%KjfIޛ]%_E ]ؠקٚ}π!xfCS30kg+O덆€;pιrj9}5(#m^Z527IKa˺i7k/=H3Tq"iz8U=BAW(h& ~09iZ4eE9]AZL{G3[$럫}Q.tw!B 0RA FBfRߧH}ko雹ٳ"[q#[~kq\hHAYOZ1( ˀI%8I$E fɴ A01B?fc_$mI5y0)u8f{: څt? ژ-8t=w7]Vzjd@p k_/ѪKo|:'5 ?ۧ?k dʬ2ŷ]xx@9Nvo6 ̼+Mңc)C4K }p9Ο2R4xRf/3x"=8:VYy5!VTCÒqD{Kxo69/D ~Bt BȾa ͂ZUp!ڿ<&XИqn\X)$Ѩ+C':6Z^rFd5Q W̷^TtXBL$P ;TF188LOm4Z-6i{I$e6l"oVkdτAlh ]tQnH^,۞E!F;gGܗJ,JsI6JRD`Qcnn-6ᙋF`) D_l;f-pPFZD;!#mc8X(R׉ яh:bY}Ygh{8? f+;wN=96{M]U:*5Q#.K ZEN>> ᳅YQ3(yqeOXSulcPѣԻM_z= {LD**TK'd k1xFW:4s'bjsP1N%*LC*"Z:J31KYi7e=2tZeg ]]⾂vLde*CjyzHj W&$ jHcq*dSvsV}^Fx!s܎MyߺT𮱀)1fsD@7VeY} cŻ֒sQХ>{ƨПNYJK#XsL70}83cƫ Ш2&Dzm.n8Pq+jrur7 :^GF{,zLI7Fi|c~'wɃ9Vc= < HӢ1 _AmqsN Zf0 VvN=86 '8RV v6h&] ]a.T Fwҡ7:lb<ݪ}c.$ql~X{6+ a`2> ٱAf&a]T!S 0Fv k e|whaL[F&FcIN9/茗L(7ؔ4t쎻s"L Bvl=g>L'h%7 fTU|'-;mOOwAx^LMy`e-t}拴?{m=3('d7|Ob(Л\-zj6X<zOtO\'v"$,}п%Ժ0='A_ŠDз.۴7wxx@_\ǧ0:69=p|DNBapEt) SP`yG|F _fG = fkFu,Wg/pَH-/O||x?ײ0ӷBw'o1AL:C8ѫ nrGL:+pH<}O>h -YbCU_Fd: .&I CL1-bDكu0aSv|/v)a+2X@Gۯd^;DٟF2j8.'FaXu2E)9&tI괽d!=<~fBAnQ{ȟ>plDK='IS*\/1{(~8ӗPp~__\^O'2-GPq0ҖkĨfBècw9RF( JJU9Q E>{ӺrOoS!H, F8Tc܎IL? uXPOr&P~6Mhf1-e7_\/Ns+؞o{dM] /jE:)%j Ol".9G< #\buZTcáO( )Syo2#A\zѯր22Zb`TH2kŖb/@)bZZJa<"Z оPHUK %a@l@"MY۟o{Mhed|Ƃ hKIPRTpA0As06|Vz/5;-%%ua&%nˁyJY@JaBIvzCp6Jxʽ7$ m 74+VGlF}bmO.?5@U̻>? CktVIGϠXmQep0ԤT:Uoxh.-ݗhZ{LPa8K6p*gIo4̆޶un;ojޯR~rkP*-_慼L07…=*`C)ґS_St}yq(7ʀlaX(&3u*Fٛhrn:0Mtj`QQ 5%T{,9o 1pX$+[qpAOlgd\. 
uJKnR#RD4(2daB z"i 5 Y%‘F)Ck["Rр*4݊D{h@@)&;ZP(Ƨ SR|fE$,Zf2{DͨDB]!2U(5em L:оhRʲFX:{>XS!漗P!bVQ kM/N-pUrKWsy5✶{ɵH S&QX(j?MoU>9ˍyyf)97sQ qѲD EH|57-Q'\ii5wkVZ 6 ŝ[0kN`N '힜TBS'klklBQLJ_Ny7[>[P v  gDZbeq##=e+ҖS]̂}w+5m;)䨆5W>8V!ʅ,J.|CIA5?vB$iSʥ^(sWd}p7fӣ~qH~Ō+˄j1<hPEc#j$Z 1 1񨪶A&{LEL<|VeE-@|ӄzfbft8F0.Hw+ nbT㩣e[.*i~GڵߦcD:'uꙕZyI'%ewtLϣzz3[sg z!}%'0.`}$$Uid,(xgqиPVu2fu;-:o>WJƭ%-SҬXq^8OBRL.)4_F@8\m›<oJ:ASII.Wf ׅxHLa0QL%w,`Xi,{__pRf3>+Faxp.%>qP-52՘&gI9?6ϋ>.@9g m1H`g&ƀWDc8.=v11W)QaF%&BQ r3>alss5,%H#霕*`+=cN*/ lA(& ΖP yf$r_G)uam%#rҲrGEyI䐥v_Is0%sÌVf#s"18>ʬƜS4\a:O lI(a4ܗl/…Rw@"_ (hTՁF vfӟ,r֭~-#LdfF= ^ eS6)ق(dfV[Hw6'9 KՌ:ZCd4iqJXEQ06Qc$6+)4c]xL)Odu-.aXXv,FTԸr[$6jT$95H W[h{짪1"Tr JrVW`Zft>|L E{8C#bD0AFPw{G48R (u㳰mwBzqW(Ȳ@*`|e/Hۻ!.Z'l)N.'n+MKJH<83/}nh.:|n,im#_/nk4SӅ4swTZzqalav5)}*xpQ~Ai6qg q\A5i`ݙ%pW-狋m<LRaQi<<^]чP;^g7/73ys`7#xPeqY'QY $X*]-QADaM*`<NtkR4J8DxuQͨq!$)3kZ"#58VJ{wƏQ!hSPLSY.̀RpCoD6z<պ^sf .k7+a S.+p`o- $ˎ:cpQ88agJK)ƌk ,dHĻ $25V gcՌkOxc A#CI}WRQl=aƣ,n. S"!_x[RdzFv,5jlgð,8Gc؟1"40>:X[$WU@܅1JfC.4ڴ=BA>lF.OxRǞqK]OqF3W GZP;ao&wmWilֹޕ0@=(+}{7 " 8W"N-.1Xn{3I^HO%\BA^eŰxbR/-1p&#[430훛fGBE]}Yی]6|LLBm~Dӝ_W.5ҢvkPWFhnda_0]͹3)?eeJhTIx ,C.P}p<1ƀ !b&(ĝ(}_iW`~qN>F I)tK??s|7d~tfTo$yh)LuY2I)88H=Uhk 圐/<5!c#Yj@ Z0ԙHkXeAh9 DJ~O5 vGsQȟ<,mF>ޡ==WerY 17!FcޒqD걻P|֊dvVB]Z$:+XNn9;}յ0Ee~{ntԘN`Wx-ѤS-wP#i,-i+{٦ֿd13@)ƷN#qu *wRS`R9+= 0IxO*Ħ5Da`\B *Bx3h_.*0N:?"xaim+XQD1"quĂc@"_~La)$O8n`?Gft:^Dݵ!{U7}+[m˭m'>nD}ϼ"Ji v ?L9 ͸w#6tXþ㨼2Matf w<e+e<^Pa\ E7gJ*tʘwmV?y*Ou|[^N H؎^` 8d]#0w t Qj E|+zcͶ輫AUui=> !TX`]["m$Hh7͝=CpV<0+T0*׽¹RqXgM20ߘ2O/x>q~-+ϥ)Ivl(iU;([[v%B ʘI $ul`|h ֻ:TwaEScC3p9!|am5KCG, 6M?P2}ݟ |@Вl㭜<0+AABQ W3p&AB+ lgC*_^`6Zw %c- X^&+ۻ/nX. LuA]]yʯb¼8eRur>g bOhV='(Dwx"|j2{jܗ_;NY}H]rJPH [*MnފvH8mɛS׎B+}vWQ?}ĉJE "`&l? 005I\VRw7*fq gp˂By|9"חЊGBy3Ezz H}e MQƙ2Zu2'q >rl@OY$R\xMIc晌W7ոٞZ솏Tgr`usVZ¥)*"G0&v Rei &nBWz5|Whe,T#c=pP3'qA.H,*Z/:w?ʜjdObu\NYfi$q:֍Iu71_ ϹI&F^SdXTSER=._> ˾dv}1bw*|buAb\ή\xLq±#N )y46.UobXۛBR)$5y SziۓDz/JNI}aic?R7Y(w⎪q-|Γϑ,O@a;ڈI#xj7k ƌZ兪. 
8k#" QX.'f=~ݟ0,&#0mS⫿EEA~6"xaiBz "bD%'R#e3G78#R* E?wG΢dzUƮ{UW30\Eװ(<"Z{()K8't4X&) thhE/0Ѷ(aaFqT^hc&E0:3q?cY =&hVxm%74@Ӯ!?>{8Нǐ{hxzLv:8]} V cBu]FםyխHu}%l4 Qݸ`gNOF\Ti}>9Fï?fif~g1/Wӆ~H<+luVwm1IIzhgA80_GcOh?Ii@;%{Ѐ/8?$wTY_caoLRU 0ό&]cI>$J+dl@edvQTXiO=c}k5 im iܵJ$[Q8Wޞs$`Xt]MJQy-0xL#0:^qdCTS 6FݥוYf*bR=n杻aRu-C\3»s3Vy^bu~u3`lӓY&`F y~u楥Dܷ{0S}+X#ǭTY|z*=1v$\g`~c"rѝpl8jԅȔ.qM&J9DWLPnhUd@b/Lp;hon @#SU"$bYb ε)Y> '' k.9"{&'$qJpeJ9K_pS+Yͧ$诋%* /0S VELʻY<2 i\\XB*歅Z@{ޟ{´oc~]]7Cquqzu2|]^u1-rՆQ"η1}cKuJ\F,t=bJ0WbJW}ʊ־^k@k>  uJa# 5ɢǜ<\͋s;grY9M'BN &dQ/Yݱt*6W}Q)1G&mftg5H:'U_q0uTL*U V1eP0HMhΐH2J8ξh[A|Ѷw߲|,שVF!ӊcc7~e/Rvz"n |QXF@Q"\g)/.Jœؕ&mEDg2`h$N3 ΅1%hwUmUm9vrv宱X5jd*ppeϊW),opJfõ;O8P?w$sd~f!;]qۦP?67!Hl!y:Ӛ+&6<IxE1XC<2˟M)xRI˜gmۆg7kp!kV zKe$GW#8{prMea)늦+?#ϸNYR0߅jS{Dž\G,.ɿAlDA (s>y¬jsT-Dsۄ6 , {jd֝A[ ] kd—Rɞ5yjmbS-DW#G\x"x%T 4‚PVK( ƆsPj&-,ɅL2pl"EM ^sB7&#ZP>*${' phRVf+!zC9WސݤߘX_U7uV^7tFI\ &·;!49sTFFouOb:?˯1 aGHaSQH1*2kax܎.c.16VubvDZ3qt'}ggooBKP,nhn(:7t)ci1%.W:L Rw\-ࣈQ)l2zX~slM*f} /ϰ\jfl)FLuU x)셚zէe ۽OQhЏK^=K^=WWЈ TIQ4An %gLAYbrX!D8 jn+|%S8*w<_ٚMX%@[-f!B5BIO=q+`?»O%R>3Xϫjd$^rĆv`}PH$3MIRr6F N296|ma^HzD`c6972:6Lpڊ-Pl{1;@bPWE"Rj`Cd[Y kVlU%5M oejI% "jVgm{ZM y -%|®7Rj#+"8C+X |۶S`{̚MO4a(ecTLE`M7r]ԾOEaUCB6ow qRː,F%;eFěd>CQ{ *d_~=[Y^,Xm+ Jcfi1U+-xֲ2|Hh" +Sg[mr$8ؤB 彲9Eۙ|Rעe[qHf#+P@C~4tuY{?Hrf\vvbpaLRQ>; 1c*ժ{*R2udr6@L%4iߞI c`F5 l*(RRsuzU ˌYf2@Y-5ŬsKJ28XO˓An*V*M$ٸJl6N5nLlU*bc%1su*,y격"p*xw>+V^4Ƅl̮5P*|ee_y+:dd7=@W yIaj5C^3c;igVwMT7m!#cT|۞`>,$ VͫPDIJ%>izb uؕc݄3ғ XQߔʮWQyT/Sѻd5EVvW^zjkB45&Rfj#Tw1MKNТ*)B 禺 4)4'P^#5_]p mQV*nPY{n{rޛޘ>{{xreg^jt9s;d;#u]D !QŻpG_L[X&m})RFggUHpLZl"k+=*/jYwoȖԘU$*Tw\ *ۗ`̈"i^C?ܵee JΝQ YC%n 8y%p_8mƱ-6b(0Bү=Ϭu8[¼=vn.{؍G>󕮽Wvx7ɫg'o~Xgg'ϟ=s)>u2Z?{‡Y9y~xVήѼ&2抲1#?_/^j{?W6 4рv|#e }&xͣ%y!M)@ci߽X ]0S$ѤH IуBbl^k(=ٍ+Ǩy{cU 0Q`a#[$s@BmzȍnYq?aGpyLcٚ>OT +.wWeo,>͂w3V"ζl =N~@,:9+xnċ>SLR2&^Ǭ)w=fm 07,<)a9W~^W*h{aQC2H8r˨sH6(,=Zt|đ:w(u 9[7^kU~B^žZ?&jcy@\ -ЇdIN@<2[0*9w xvtNRbg8_]$pg#$An5@īFB)M%@j[VV1֑l9`^ so4ܛ?;+|1\Wr.hS-wH(5زa$'en!fc E`ݘ<;hw̽COB.Y>Xm'jP{_- okڌjC}̛0/e]v 8VeZ*0O6H<2/&#?_3$x:KGVւգwҜ׈>d_(_K9_vwߊ7Ջȼck^[ ,{ ,+vUN9@NԪ=G&Wv(3$źQy׋,I+r_ꡒckf-{{HɺlbRz_h *.¦b(Y^|dXW.L1 DisrvtqA{'V@{vu |kG-GC 01Ԟ"F)dwNi쇰bn&b$;:fhގ7wWvtޑȵ3g{02dEEsUu;K"ګ]f<@  edr)UB^ (j4c1FcLrjv=ޖMDTs\dдb]pG]rYf;jw.?VF(ZcR~b OJDRz%*̭aEzk$$5䜞R!z1.VZts,5@ar {Y|ͧ{?ּݲh_z2?C2rzQAO}Rڛ>+r<"ś SN>듷w uӷx4trBǣ}QU';8'GzٛT~0ssӘE㱏VX:`հx+1rvX3W?8m^YH%vz`BW5-xK yD$sbP7}u342.Vq-M(EAVGA%'l f˱KɷKFY-&XH:F$r\aj6og#!_<:Wqu[PIt?ՐzQGoz\|+k:MQ94CxLd<Μ ^;]65m ?gTm}<]c+??]CMa M.fO*{Ynj+^X.͑"2 H"T/m3mL6]T["m`2,RՔ L[k n&7Xw=aU zOX{ªaOحv-L:Ene'VǑWP&HI%ە%:&\bFWlPSVKVdh$$4^M 6CvQcBǜ"!{`F#=<_Hx;3WLJ;4Bfn}<ط0%>c7ދ~7<ޛ9?7>4ϰsZQF֟yzFv?026A@xI>]ك ^~s{MzjJt@P14BхN#״py^ pL_^^__ݟ#гͧj .rESP㞎xAԠj] .;K3ƪ[2Bc 0ybLϛm'Lڴzqu'zRK ;AOtiG2.o#[&׃Od~,tMYFuˬɇ_rybGAϓYA|a_Cp&n}3șIg/^[/i5T##EҤUZ2YӶbjYsSvMXr+9UkɺC\mjlb!>*Eǡ<ht7@Ix6F^#',`͠Sx6c Mڴz 6( ]9M^̢!60x) .W4cat$_:>sgԾAbz?|v( fʟ?@tF`hmZUȦW P uJ^4IZ–}90.)sDb(i%GwֆBsϻ .94Bͮww qq]I"Ac ڭ*:n{J Uyv)>f?3z1Dj؀,~Ei SP9(O. 
GOn&G t魐 -gSC_놨B25X;L'\28Pխ-ɪbTbN[4]-W"XjYD,6^K_(5E]9p{_d|DEkaS28OIT0xRBu&\T7mbYH8>l PL1M\e"I9'.}jεBQSlb*4_.(54Q1Nl y5Lz[?zbsAO:).z_\˸s]orJ$S 3?0됎#PLG$ML| i(Z!j:2@MVK urPRr'%@F.j4΢&C4=~|>eYlڷM%kF(Ea ܬڑʻ"+`Et,dJ֘ۂT/BL> 1yIf_zkz7 =I7՛L@9"T;u?װWEk1+9rM}C;m)6T"0?jSr1*CgŅMGjU8V((FB{"&<8N=kJb83 S]}׮ *Ba>?VD\[uJVV|lr %+n +PVPt,K_2{Aq+EJWCAWe%\NZ,;mdL).Lhl"N#h|p򦄜(XXdidbysFjs}C,g걎Ut+©-vVqBʭVK+XjՔkӀ6Φ5\y j6E}-esH?NuE^M*{sR<: 1>ꮅZ Z벬A96ן>أ|7hYμ,=>:ҝfa>Ne4 mʑ*WAYAX9ˮZ1'۔Wiʇ 2Vg)Y]nBXzgZ /x0BAY[Զ*'Y{u a+ږ\!&!*&6Lp.A u_lb?a^O5*g6A}~ȫzyS2w[Thjpq6`u{{xgP'>6L׋sY p?ox`䏝T4;{M~3 o< ; GŜ{ږ?3e"O5˵>T*pTfkQǝ~ruHsu[K @\}h=GݍY<=o5wg?%?=2lFUO*a "KWjxY#~~1 i9!Kn.E'Ua&Fga?*f5;u_j{M+`HVYi kxIǸT~``ƶўNoxa~[7yyc[b63Ǜxˆ=ɿNmvrd$771'̨Gu:Og^@+hy2 Ba- PZ;-s$X:^]1޻/Xl[@bwu$,/sIr Ɏ8yLXBւyJ:hެ:gGFQ qA~\8B} hMcοzӿN!,v@BpOY;p~m0}7h2L7O m:7&fhkB;D҃~^=6lA>n ֋㵺e}Nw_ηaL?hNR{=~}/D d##{ {=7Oy/d-7ڇٳ̼o}L=Cd:xnjd=#&, ? [DoI{8BQc FdI"'fv^Ɩ mB~Ҙp7[8%@Dmmvؒ5 ssMt=R'"-Mq:ӖnSbD౐\\ R[̀5rG#8E"iq IOoڞ!'e_A#<L?֒r&_mhmrXhYV'q;'ۼI.'4WvKfL4Ջ݁K'Gܵ߭#'b=_{T;23ުz9hJ[ËVq5iw??zgRJ%"sH1Z["_Ft55h$mޕ-q${۵:toZmYiK<Y\!5epE(2ϩVCg}J,SDԽaP`2͉`X(șkN ֈ#)*^W$иIpֶ@sfk$Ik@7g.nP"B\ ˌ-n .Qϯfhnsm37ơ3wGMwB}w=Hl5L3M@jeG4m,6^XtArZG:0r|~M݇װ>Z2I.>/f hʣ69L)Ow.UmQ5\w0CЊ}cϘj*)^}sPeHHH!!ѷJ"zڔHR3 SPН锚\El"b0L' CyqΆ%%%}Nܫ B| 3P"jrF[H@S-'80> kcݳ]{c \/TB 30#^#i_1A߭Q/:N+G8zub[0 }#4^s(,nfqx}/' Ua&:|m$ K15Ws H'蛌,m\s'G6yt zjznx#UNC5 s0dp2J E@%4%{In4:.i..edM&2mkhۆkhV*=g}cI7Za@a_lŨb?}?+rEJp@IG5g-J-xT'gJf.KesC;kYsǘ]wq㒋# ŴPghF0"yC _ {S\C1FӇ<͛swC{cb)9t>Wϭ]klBuRlvٮ`{=c(R ǎ\JncSj_NJ1H * Yk5s]g{vÒ٭qi ]>! Vvm#H^S0BQi.; z0;x3^*s`+C˦L R3}P%ph-wɱ)[@7Ί5odfL9/D;h83S5_sLs Gw0#_wEu1jN>R ?K>0Ҳ:B 拄j撣ꪤ0HmI8hVw^C˟ykH:Bϟ?lɼ A&4RjR:KM h {#g:S1;D[%NUT(f9+\aԳ5(;Ɏ5aw%Cv`b2 HO|~M h+*-얼E;fz0WӃ93>!ys%cI#Ϟ3 qlqA4VFy|no`/SѬs z=rc";^tEOS%f"U@ +ݮi4Me$ϓW-f#珺=:f*NSj& KPN AF&T01& !9CN,k2@jj6Nk׾tJ\_0Ae{cyTEht&-ZxjƓ ?pu@8VLnq;4 faF13]]lpl 2߮#8-;WrȦjHxohr̩: ez> }*%؁hK1T~ ++^ɾU"$C?>֤G}!5#3`Rsq^3q;,&a IVHks'F<{ a\`dd`(nE!,xF~ݗ)o>VoC%u]xa@c I M!UdTX6C |S& ^tp ﳯKBi;ß!Zlj&ps/Q\F'o#hh$AT%XFd@ TS9)Ϝ%#u<q%c8.IcۺGA\SUt&cVhq?,x)~faJ);zghᔼ u3:*Ի%w}yXxq79ooۧwC}7߈fiVeT~{>n~bLScR8e7K4as7 ʹ8=9ybwYrCdn?XxDL`K %qe%!÷cY?ܻ6l ^2!P&ͦF+H7ƳsB9&_>D|e? *~)?Ct1SO4S앾)PcD9ÄyE4gL|qa]Z<.aCk^3F0< ? 
5H5-Ԯy[AwCGaJ/R'yu@RBVT=)b^c:"kf՗`pN:-@E YuͬoDx(H9CZ~0ѡs7y`\dȸsІgF g xm=|^:@ xܕlm %r|Fct`(q>4M}춰 E|Qxs]]ߒ,7>hWnȞs kmH /wݡYe ķw&8'Yn {ZfDK\=b߯zH3f8Hm6DqUuWQCፒdp tc?wT8 ρ#+gȒl7 S5˂qjS ȁݞvBri!v(e v<tQrd˜HdG@#Gf&S%Z$p+trhahC HB;$͍|JW*,YȡO4n*vM3g L6s(%ϡ]ኳ<±F8¨#YJ9<"'<3SNsȧZjLG6}b# 9Q#|k聜6ãn &v{ڑSB;a&6O\M 9{Y!~fĬE3"l/o?S !uV^Wy|ܗeE% %Y(hUxk z ,@K|,y2/=59C>uܕ(l ֮U6 Z:eL2XWZdH=$A*ZipkD*#$Ο&rDA[ #,P0eY-/Caҕ%bc|Ygy)wVU^lwAX_1o+L;%} š!DptJOq(Ssd&KEQmQU5"@z%)KP#+$B92Z!!E z 2e6JNqAD)u,#H1(2}i"-bcxYWyK&͕15h)#6~ RRynYɽ+"}rORM9] E~t[hLi_Hl( [P( p4ʂg% { eH-?KV0.$-+BZ:bNXхE%Nd,UQ|լvm1\^θ3/:UZJo{|2P!@@[8ϽiV yQLEF(]*nq(re2 $v 0p( #\Ut%vBvR8:FY`R.,͙1 FU*}R(!4oDSwGP,Q"v kӿb4FRfFu3(YnErSؚ]W9d%}qFZ$]9, +:{NO!(Y.nwiCc*o5W*633i+ђe h} 6 rkAu#&9pS 2XiV{ɷ4Y-f2 =]9z .%]D5W`Aw~V`[qiyԫ tg9D.ut黊_?:>g xR:g谜CwgqvNJ»V'[ٽlsvgr{ 4#qkK\IHɔ;Cfu7)E 4SӾR]o[xvLz|j]#$oUT1K)*PLR1eeRT,Zw5dnaܽ{RCbN|a1.9%-ݝ"9\P(CJp`jSθv䠹=ܲ*'80}rm 73Żuqͨf*!raW $LN24C;7rGߜdXaߔ<ue&eCdHpuLQΘ9}O>-e:[nNdM%yW]\^WBv?ـ rX6\CzƮ5HlNk< a8q1gČBY*5rZ۾)&3lsFbl8F6WյKGϰ-öզ4M/l_oHntɢ?s<ʋ:kfgfk~1J^R_gh 3"sDj8(D*Lt<ťRgDjg4 >hl,|qJR[*O?"֧A1nQ׋ߟN#Hkr76Ș/$:×\W,rvH%75de߄B~ s*QxVZ"ЖSX2묶ܕDi825iJ ͝5'nI=y&4alnl\D60MM9Ϯ%> Uq(=ɈB /( }oÞd#sΌZaXG }u;%Y,r/ZwZVxo%Kqm:*VWN Qp3`⤔t fxYUK+U7ex J"֧A oNjgxť>A2PεV9û q !!{sKbmd sY.ʵ8Ɔcy\c "Ċ(Q?`oV&ͨǽ\I,_|d#rT3(7ˇDfwv$b(xvߪZ,+KmK\B`.?OXKhS4/9>(N !ܖ ph3 4 kE _D1Ԛp*JSi% \?)K;bIܗ,k+)3I6次HWQmR nI{f>V`6ΠΊmKD92}+ZX6mͣErѷiFrG77/0Ԣ>(BJ)L~_/ov#=/L~׋,ޗiHY(Rܞ?⒖WRѷ{(Z`g A~;zP_`sr898[ita @(\ -9*&[w̄`O%^ȒP!wh$sEmyH~-1S?J `۸((0|Mj؁ko_Rna0+՞Y7kF\:zB ;BHFZ-ϥ\:oVoF3VH,GH~*7)kTDη!Ye_q 'D"q5D}Z$Bu9K,N)ከ;՗|st82ysMf^cɚ5{:l~f;BCvHX*͡AaĜَզ&ZݕAwC3+Rrq(@Pw7|u7s=^*j(]QdbĠ*)+ƲuEQkt.kL͚r-16")ѕBh\Ǟ) G;)cO7ĨS~"eMKM+RVS#>c5M^G{}}ѥQ< N7$'um/7lRe$ycQ^_9/_|i̇4_C/9_bv^R3V,|j3OjX%uhQD`t3/K4KԼErN6r;'cTﱽ(Ie&U4~fwH臯 q#MOun;mp;7o ga׾1{޾ z۴S=Ô SV>YOv$]b9K\ƹh9 kTӾ!΂em 5e~7Z-]G"͋IO;akm0_,-Goή?%F$B"5fyғ.~O~sYHo) pz<d糤RX[^~7I )b*.pɅ[)_~}jm'T)5Z~0m`c1я]XBmg:Cw?}~_~y;nt\NOޝdWOQD3thZq41,Q͡_Vx̣2\J]))"_* )hvxQR !O祛 A7-s炲~5uLXC7-=鞃9=w;0N|aF=A-#H"E7I/nQ书>;{jl:Yx_3B6]T{Q?gz_1e;S';/Hb_&0xVb[N$%ɲXJLƑJE;|_*f^m(mm_ȶq CL]s Cy9N=~O_(F~'Wi4\~޼w7^w{#>60?qN|p=̡Be3 > x6V'و0H!JKLRI ZsbI[o( I>}Q~-)І6_X(#8ު?Ҁ9/@OI2Lq|Jh ZG&dAʤ 5! ?_k1rj ~ID2#1d B4J(94 Y!xͤYBj7A`x c \Q NA2{I 4I.8Dm 9 vcv2tu uP aT4"9h,S)iJl "mA"U, uJIBZQG,$iGM-h<5 "ij;pYĂ\J'ж}Ba*%g+4M'99`s]< Ɵ ϼ4[40fl.Qh@Y]J-2]q4&V%ua$ hWrp^ x…d&=9 9=\ 4]~J4Ѭv-Yb4h20/T klΈ`RJ;KsA.8:eo1*j`ࣥ5XDlJ&<#hA1|f`H#6jOֈ(ZjTkI2%{&.uJdva;6y?;;ܿf8Y^ }6/X/CI)Mfd0?ǵw*8 T0\%BtL"eN5;Lʏx ~+z 8nl ;eOKE 'ټ33]Spiw0=%m"FnJp3oBVּp2f] iB!Gh&1QB6N OR`iscv=j hz(c!tKĴoBXvBWD(rި?0@*M27^䋧*Ur"zsL3y2't;+T^W".z0p7ah#s@|m?=LwC_ǧ;4 ōǯ3NPss5Wa`&j!Z78-`Z.5O_WIJ*B$V4ihD+ xdRZ^&u+8sam/,gb _G v;u||;_G7=P\:vN×`K ^߽?7w4pɧrF;PvA ^1 ]1zvOya*s'e2_m^m{a־5AF!Y4 QOO16e+\Р Ηچ|m|geM0'\3 Q%֐?Eދ[0<6/Uu羮 ZSzMaL~+`u`q /~OUWig"! _ŷ_TWcaĉ]8O4v336 X B,'j5:x@Kq& >~4ip7u=͍-fy'^?!^(5F3@@v_MpLX>ake+.c| 'Xbљfl #,74pV@Fژ~';Yzq??fJpb+RWx7_c4qCQe`XlJ*$0_`x~v:9Om)J' z ӊvA&KȀ$7@4Ddt@S!%Jg t۪n B%jy U%,+pCjmTJLI" 5KH|o "%~gREAdX)EFLʥtI&&&'L9Zc ڍ 帒 ͗= f+.*-R#k.#@-soac49Z*XVDv؍9ՠ({dXTrIf ńi* 6EĠ;y>\wo#b*\NO#1<2e&"#F0DS$q֦ę0r *hW)}Tq5wal?SgpF1~&W~DZ*IoW*rD9PՎ4@ Pq qS^5֠W-fz@+NUSa&Е+{Dlhc%v,`J퍔t9]h[. ǜ6/{X6h@p9&̕(kW66$ޫ6Xə>DO[t::^A~3kC왚$N0}&nbPo !GcdC-akhg O[/p{ evO[ϔL0qO7fdG^Lȁn=;'r0beF@˗X*-( 锍Ș3T ߤS"p3NTL?.?s|*4Cs~Ĝ,U<|;(V3M!w-}^%E7teLJ{sCȳџmp)^c= R4ؙ<A@mvw#$Y>^ɷ׭̸ֻfԱ>n83R7qśO^Ѻ^wngӗ9Eokw/ƽ.2c[fuU;xM~ *?I V)APM +4"LqRR,ZIOgٖG+yq[e܁p̔_g"lhվtSFDT x.Ql6<ęrccs/Y?wor? &{O-辔‡3âU|X@ء/|XS2qBGo+U@PC\R1f%$=č7)\r{ 3IC3eZw~S Y>WjveOhqhid3L֮cai N:\DxBR]"Ph=&L\o8DC"P".)9cʣ K.Adf9͜L7Z9յ ! 
RHyO2Ta0+6Is~]),t[Ae5!ؚs3nʔ/ڤbaΈuAS0LBrZ\ xX%j3t2OwЅ> '3?5x?Wzt5S N?|D4>cuB[%'؞XBNrՌϕRHISCLtS4)%KQ*( 2AGR3+GMaYCZf~ Sosz޶6vd)#\Y)kBtJ9@xdbu$qHJ$e B@3o6Tż]Ϊ,0r1FvVuMbv;~M*qO ͽZ1\;A6Y!2UD.Z $F=^'Г*Vj@z*Z!G&ΊMq6'hJ;B Y,3s$CKy,Z֡"P5U:ZFCf[ ݘ'4h_?- >6޶?>{>ޔl'5+~/mkNmW@<^R2|W^|js"m<*?)zC' iӯ_6/?5/;g=)3/ 9Pz]K+O\/wc_z ݷDw86qo6{r2S+~d+nM6^ƫ6:lݰm<8 cA70pk> F@=3KoH=Tt_0y|}ҕPL~wR km4]%<6;1L8"\7ݯT89^E~y0S,vNJ:P:"2z,lzIPek* {(X6s~/]uN[1,: Z9iLD%*u!LkXTsjjतqMh>GW36iy$/5ü.0FTt*ҾtNBdaw5Ss߶.x%ANIRd'\ q$sV$RP>DLR6";($^gv8NY8e~q3Jrk|c g*i85AX @@]s&9eP@G5_SK ZZ:ù!vbr6.e2@Uqy vNsŅՕ o׏*zek7w K7[޸9=;7q2W혊\Թ;WvvqS`\y'fߙU҅*9TJh]T17W8v1zÏrsd]]~HѺ;~( PvTP'=֖ņ)d*n2u6dl석w;BO _qqh2.mRjmmc0D7@iJ3٬)u8mumI|ȻoޡP|N*#-5l sH]sDǖ(l 8BE/.YzAFAck6"H]>Jr਎B'Jg(QETyeKpJs9ua4\j8rǤY{Ղ9pHmv A?l40uK7CƔF6h|:il˴käWm,:.N>Ap#6Lh6f?KYN4ӱy+qa(R9 Эe:XϪ !)ިN :}k^:V۰JR>;[W v~\ HkFjYͩ?Mho!y–pӗk a΅Oo_V߅wZWW&2ۦu^:^~fp)] _?p^dNM{N64R8}7Kn\f[xfAMQ0Wy  = 5)Uhk㨓/KہG;vAf"NWouf[wNOS >MJn7D}9Os?7=3 ٫Wokpf}}{ t ^:͟q8B v;oaTcK NL^rv3Jk;́I*۟7"Mip^7Y޳U_$p5/p>}`K$nuW݋Ns-f"SϾԤX1GWzyg9t̗2\[t胨߶]/_6NVDreČM"V:^e 'جtKl:ftbl؏Ӌt(4|6[>IOJDr0 ` Y^~:$92K+\ZB_eK#BaLQ67k|Ǩ}ӧ!dBڜOeĊ*l===#@~k돸~%_You` fUb%+K>,)zHKV:䯸^u+j-ÑG)$$g)emR)).D۬h:xMt\ApnE^j3Hs${Lox@ޱ8xda"K!8Q1*1sZ%1d.`%fZJ Ao|=x>ʓ)*(p!Bc{7|Ҕ(5Lχ7/yR g'0;N}X:jcB5D9dHH%_^-w3#u(z(0fah`(({zJ/xRKa/JL=&Zc&>%D"i|1nqBXMVU9XA3DDHw5'zk?,aIGBO ɭ=ݜooR2n]:V`9. ')ʅ?ZRhQ[z x2'H윏%XÌU#61x&Y`,xH5glX,%@-CǀEY+M{hSpy. ]fL Fa0Ҷa)34M5fted#zDa[cm\t! !-/0f5I(5ϧa 0g~LZ4\PD2ԙI7?e”E<u5qg;D|Dc"& f8y9 ,:0μ2&|Z4ʆBK$h]+GXqI݇& ::::/>hWAl/F.H2|HKQ/[DD3DQӳ_aR;$D̘*=UY\t^+dvG(֍\֗rDФ7*b}** 53t38M} 7 >5?O75nzևn!F+ ?id0JL=Y&TK7&Gn$"⽾lAsWJƗ#[>ZlղyubȄ8sD 'I4N,NTE <9-G|`SKez%4 nMnc\-ej2j8{ւŠEp4 `RZ`=eƏVP*T گK@q8M(Fd+LF/_{5%LB[InM\XʘE\ja.O8W+|ϰa ΑESS<\}qdA|aOX.jZjq- MXZ΃k9r^Fؚ8Sj@q".H"8FRDT h*=Оτ f8G3;GLL틚wi`a8Od=5LY ,4ײ[|VTŠ*V:O,Remu<ܸ9e ƫg[:^XT+.ʹl enBף !hxv+L?\f#7ԪqL{xY-Ӯ ^mEӣUُ!r룸ko϶`2)̦͢e1I`Mg'c,Y\ʲu,#8EDqnnYZطn0jE!D,-rͳ eCTsO]X,*=8v2MSV \ylb)`nb Y Y S'm]K%ΥdWZ)nJZ}VJx4/mX2dM lنr&wt:ـr{r'=; FBj^T'.0BS,AV7V|faiIs4fs@"(۰YRW602+hͬ) 7tv:Le$NQp9wK©^ f1so}y=hu,i-{Ư-i2komQʭ^dW$`&<ة0fҤ[GHK ן7'JZnֺl=lW\N%dꩥIb;aA1k.}*Pb?wwˁA 穃5 s{~H`~͸ \$ck%ID%O"fl)() 0)-FJ$Ct gb ka)Ֆ1GIf&d,D|*cO9c/Ż}kn؂ 4D VFan.ṲB3ْ8C~^Wl4>eM3 {Ri@fԶT.!ot;d<L J3A} JL *-w l=aG3^\^[ēW=UX JysW=pyZe 1Zx8PBH\ŐB6KͽyHu .oY JS}_ӶD4".=,pN=U@G2@bhbU-b.y!cϙ!Bl"bKB5 ϧ.v 62أ\X:#03/,'1MñG(":h.91BL+7֩7Z V89uXqc <Si%D8GPa!%[2α:gٻHn#FZ{s O6|bY$bH~ɊfWbUUVKAh;s4LV@ vSjlťTqn'5D!8hա|%ɭOj1}|47TKݯʠYrHoIV Cbp ̗xyPDцcj-9ܦRd]#gw%OWw?, -Sn+~4$䍋hL)Woh7CA[)9S:ɏ<Ϧgj1$䍋hL)zP נJ11h%#MO4W!!o\D[2%3eJ2f=TV'I6I#=xRg X2|>[_g)u${' lgx@%oP9B.~?*Hi)'cfL L]ꁼh4n>P_턈"N4m3F鿥UÕ"c xkD4—ųYJƁ,5>H0&Cf` 2i&gj@)2.s0OuUE*EH+@y< `o$W_x A)eM"do$TDUn^T@!z8(IJ SՕ(Gϭs3A`|i _܁݋K,#EAF0smzIRCF hk㤎2 $; n)1EÍ'rQiЍU@@G6 $hG.r& "e3ʨp)HbQx#(m4h@[r#i1d6A~ꗶ @Z,@tZܪi\'"ESB__ԡU|{5zNtJ%C]5]pMT?|4sBO; ߹&JNtSfɉNeד54~땸SMηR\q~[1@D}`KR#>4Ҕg8▀9;N e((SH"tȈьh0l=B{Nڎ¶Lӧȝ"k+,r9hӌVO/rHe1PoxfL8d>ٸ Lj7&@T= \+&+X ֺB PjtDw)θlh.!X Мw~b]/ R^/y܃v@Y0L2 Mc"#u <h,ŊPTF e$z5C+(}t\h* 8_a'PɌW$=<ʗJ&i%E] JPHCQ4 iiJA+-TW2OQ{"S/ܢ0-epdp"^VʵLNG:/J|$Niu.=]I#H6s݅uxo4 d`{})%-~#C!'LJgDZ%)-b,}vS'[E%F`겻 [a*N1׃_*AhiA1@(v y2'h<;Dt(>t{hy0qP]dI|E TZP?EH,eq1,m2^ ",yol JazV0'dӯ`@8] F(+؜[le8\* _ '*ۯdf>ťێwj.<8=NNYp!^d !!i6:Mnݲ=*$b!)]HBŒ99z*W 9 P̌,rv 6 p'nMY6P|ԿqwMwn[&n}w}?y'o?D.{F Dw_/Oal55߀%\x롖8]tR(J=;.JodFHK~oQg/|]8[&@pc;A2Ҡ2dZ)I4Mhz9gFχx9`B1ؒrsL@꩟s/m r'Ys9?Rc??;+>NzX L߬$#^c zV)y6#x HRp@Fg̛ 0_)G鬱}wR șއ89k{w'EJ'Ywt_/ d0J0m~\_) 1V 9W^ §W쮦5J@$71|I{eC6ӞNOpy9ɷ/ݿl_{{qOpKR::^u?=;xQNX:_keٹ&`dJwaS.vu0.O.SO*48:_tEw? 
A bY`MHxN?8@~Jjhxs)͍$Ԅ[P6't[yG[{[@%W s#,;"9+Vayٝ]!(h6{wOn2wִqctߝT| w4 E-I 'D*Z~WscQ{\y]%MN䷀!'BԢATV*@FJ6FEDp,0Q:kఌ@ -k~oc<\#ܬoch*} imC_:}/+O\߬E\+r~+=\HEHvZwC&enNCۂ֌f7Sk~,m y"Z"SV @knNCے^pBfnncH"jT}݊CoT zso_7hד˥NC<6V\h cRqTȑ&jDH᢬AJCT ړAx,yJ#_x@t P)9MCvlPIS=4q-)J׿"VJ11h`Dٴ[yL6q-)ڸV@T~ڭ)cv[Ћ@6:aO݊'Ɛ7.-ʬ% 씁5>mrzN&q"$='W7_լ݃')Ya`k{û_Ź; }!YgK&o8'w缡o=;E>G ֶ ZũZ@&t\тcE h>VY͠I1AnLnFs_+ӳBrFQ5F,#&h01"6DFE=Q5M@ rq iT\FS@|1 aA(&Ƌ 'ѡ,_k82`"螎I Wq.&;(6 \))StS~ I6[uysۻТ8"dl>Q3!9lD3?f b2wNܢoy Ļy8ʪ,xATP- /fHgY;eZ'zX5Ϸ 5~R&ᤋ`UsAκC\P&LN+<j<0l7nj뫋ssyYPkA"\smdL6Z51`zuj 2_o܍ I<īgsE >y? ™wpߕ,a=!ׄL~3X'fg~8ѹeJZlȺO}c ab`>ZCVKkX{j`|z?oɾջ}zvv1֭Aayr;9~o1H:]5yn4?YϪ9+1רּ[紘>-j[A\WmiB37/Xۻ)_4+!5 AsZrAVG_#][M끠9 ƼZKVduP7@PܐluLp6ZK_O )ZI 3~kɏ:Ŧi3gLb朓)q/U::"I||![8cRF(IS%r#&E YknH J?^nEjRMmvq%g~= EJ,{ݵ hmN+ jK4¡t榍=8Ch.bPOc)B(2ZBEk Pk1\({H90 Z涻^B2 '%GÅ<( .U(%U _,S!uAu_*-5_󍧾4k2JD=WT\g/߼`g}5!ВAۦeӇ07mR^ǔ̫X]Z{ !ēƅCKZ5TgOaPp$_k9JNa%m mD))Q)OhT | '{oX=0z|RLT)i4~2gr )gs1V4Ҵi.oiZ2wl5OQ6Ȅ|0leNܦl4[Ane-[,]wI#}Q\B'Zr?-)xtUˆVgAtT,:)o0*Mѻ xSu;eƜ?,i7뺄l1sv6wt7&z7Uό{{a9wUTm`"{eL'A(lIS.fS@^}g5 &JjuΚݯW Q0+cPm M|ԛ$pYB7f_93w]۫dOQɁGԨ8'=d97nQw^6-FĒ&B9#[7SZ y{5Zī8c%yDTUjad|l .4@x)5O˜$Jc}}PO t؛\]gENbeB틩i3XmI(3'kDA̼ |_7w)iLe}1mhlʸgj?g:ro3v 4 =+Uu5ʑ 2V t2 urFҧnõwv,o 켲w7/ z~~;@A,8hc^Ƈ#{v<;= @ &=r'$a8&aDMHi),RF JPbC 2Xi'딋< Hi, },h8/COVk%HaM.{a]eztf8{g3r1p_IloJ#Q],VBH?z DAD1gJQBeǖQ2aE$0 P/P5D4V~w@]OÈ:hg"EF2b%3 1)Pbn<G1d?ZVܥnM*wUaZ(:ٜS_bu[ӆb7rddO<@\#"zrk+xӳ+k[*EZB;Ϳ]E?3Yj1{!zr>D ʜBq-gpK.>т"]Qp/}[EDq).&6}VYᯛ?{ԃ||ڸlg^j9j?g˔7?uxHVF #(&B Z0 !!E$&[C-XDV͔ k1KDs{F n"bJ)фrT$Jh-v ΁cQ%re*D_;q0G x ZqPycM<_r&DR!J%A0U~g'Fa” #-K0GCK#4 YTʗE Ճȇ'X \K.c ,!sUpDXa1 C?HX%EΎ<ˆG-U9 5*>+[=!"7 3 \/ǰ-OCTjsT% ?+ w\P/r;<[bdiggI.;XZ=v ω,̄ՌY T.Y QbL 7;rY/-ɶkđ[>u0{?|xXɺx%s(~Y!3(b![RDclнsGX , H8Lp?L7dNoj?lps{y}hFƨ{YDLm+~w#[ۛcY*1٬^AԪ}yj;s{7>L;cDB[C7Řأ9f=f3B6YL IZbi@_fr-/!YW4=:U76Xt $4<&@ ij b9 zqBms/ҮqY]啺.q!ܤ]쵙.VEV K[T. /E|*]|GU'\ӈEMpTA.B3g  {O*?ۙ GԢ]LPUMB/Qyn) upq27KeGʹgS|zTLJa'a$2i Tb Pɘ&&0k }W>=߇RhM OU]87Qpn棓p10uLoOed]YKӋeJ# hq( Cmu`,BZ7cl5sze;~;$ xySWc_ZQƐF>{*_ӡ @N;xuW`o1Y9}֚܁,< ^ 9?P"Y2B)a5%$B!%0D: FXDPvвPTa;k@\Mf ><my7e2qnn ,3q {8`;v>7Z ^͙;5-5Ҥ>*1Қ3ԆԪhU$>mj->_Σ"YT\|9zb(1.!N,!2 P,IX#!Ne [-EXm~UٟmWXV(;8a4&8"I1dI=b1tcR`#E6`W mgG>L'n"}tX&OG䡻|Ry8B` OL>2 ~QFN% bSx hWzqư<3V a gkRDH\7U~.noS1?.>CA&Yƽ$]$ƾu+FүH AL(Q x! 2WHy2RH͠(9EPe+d`FHؙK+M.gq_x*%µ{H0K}r&ojݟJ@UL5UzBtFp Qs:nXFGe{C=88#V A gvm]M[Om*-;ى-wW0ѐQ+ >"\&2KZ9tJRG3aBB`HL5*b fJL,8s[J|yNwB.RGgjW~)IT,T8a\S,1ȄI`X' ~tx ݚֻʹ 7bƓyp>Ff;-pᶰ"p&s  tUe(HnR$\ B#-74;9魖#qZsj0:1w*[98 V{RZ*Nk,8% <X&ν$1x M(JjV`gښ6_aekJsJ[OUj7'$aw oŲ?=($(0Z+Nbgzz{r}WNsK%U ,6-fb=+FT\—v17Eb+tp;l4Hҥf'`+o-+c[J~[[d.K08qAe̽rһ4]b4͚ބX* ۩,Նn8;gQ/uC;.&|>Zy*ݺExJIX% |u|D$!LznބJ.8;gQ/"}ҍi.A t;?i%m6t&Uu!9zV?,GpX#B$Q&ĥ$傦7G~6n2Ng'"PIDAMi:!%P kN%\8KF(pfSoT%{/EX5;5ҳOє\5{,é= $.aHU0lS㄁aGEOKy$xfMUK1;?NW5vu7|(FDi,3Y(/MÉ#}~OjV`o޶]%;N +!^^kb)K$Wװbv)LbcM80ʂL>(3D3e0G(cE !x*AQ4& ",02G_zMS)ghƂ*rpth:Z',I Q6W(24@1`=iPd1n`ꈳTb,X#G1W}e4Jp0 ,T&D7IPVDJҩ D` G[gdgY F^5 j^3/@j6 :ELAG s$Y/֗NgZ^oai4^?_N@/ۏnU)pZ`\o?sc%oFطϦiCdV fl WjS~c˛bsu_3֧hꎳE909أ;D͐" JT5<7idP|SƵVbPZCL ٳs^^M`S }TQe>DZnkXU Z4e,pR#L >TfNp')?, @}`=xFnWgU#"d[6LkueҮ":O==ӖaC/9)D"Io[tdcj<JqWv/ȤfP\y8%l糱] oz)ۃV0Fwe{Jq Ewz'Ecղ.[a`:P2ǚO]#A _w"Fzo)U5L z-+bٿM@rU *Z'աۼN%'=Dv$ =x0C8C`hϜ3o(HS~KD([U {m4+/1D7Yw͟romqġ=L:Ҧ6kwܜ1"?a{PUݛߒ%< LV*}i/eiWI(?7xOl2YOmPdz7 A3E_1yդm2~;Y7O%q y<M=(``a? 
v;{}hbP'=k-N hm *#l~gCyqu!jd{cpA2%c[tM!B'MWJs9Y' 8(o yIom#wW㴇|LjTɷe]hj!P0^ӖaYzWT8;č|hXv% (mѪ%-_PLΝ8?H Rp8嘠Ѕ\-nTNsbR^mҙD b@a 2YK(\`|A $ 몼fdc8q ur2{q(8}XGJ&&Z*/BG (oW"~8Z./ ¦([TT\q~c & 8 }^) 3Lp+S96fl$EAJPY;r{t?N<TfMVkvp.8Yt;?i@A N~Zn=qDrx/d0o}@ODV/cn'=]c9)ߍgq4bPO&g:2a8 P<Xll %͝I%,NJFdGOA+\1aj(+hWP-Iнi x]+0O8 Q]f}HhU`VjqjYv#szi~y4 [#*(PظYHr]&'mq6j~wN1Xnw,imKx̓Jp6)fIA,pSy^1BxRZrQ\-9W"6{y?GcomyavgnPaF (嶒tËۣZ^4<}\-": %ߜo?%H֋o1(l7_xOU!`G[a2/!" i&r wWlX~+*(|X n.!;hh:Z޻ά_Sc`+K=(r [<ں9?s35D0Y{b gyyt&/T={sЬD~7\  kU8U#:DRMs֯Մ&<<^p`@'gu`<=ѝn.R{^JR&I{r'{Oq_A)ȹUɒ]/?)SM4ʊSYX`D/"n_b=CUw'>JWch\JìaۗL}3HR$@7G[o=,~e9V)U~8XtUBU)M4ﮁ㺏? 8 Wh\ZkJS)>ͪ9}#_l7X^j/]=\TKe#9gˊѵLIsg0dJ6?hTygC>kQY#'޺upt }Ra4^b5jKV="9(8giyZE{VF5, ]p"FC؈y @twN8nʛSa+4v퉋|5l`?ͪ;jqNu ,9\ -,w7GptW1jD!D{\A6^~ ((F8O?SB JqMVz9ݗ>M7Gk^gӚbhw]ւgB&ʁ25VU-G@qYy׷?XJgI$N/ _~wC9\~C9y8bRóxVĻR)eP,pm(&AXD EI\x,u p= q#SKTk; =sR[%|x$ ^E|o#mDr7ˡDdgJ6~۠IM!ŀ !(88k{C e @,474( D[ם"Kg▶_N$n؅u\Գ3QD$#=ݐdTYr4ZCuO;P;0KTspIǎúnF0X0xt< at;~f4#~Qwh:[ Ph*Ӂ2G8zݛ4I[? , @1;X@IīWkuoa}T1W0oWo|\Wj&}$|,EmHHVS.W<߫1Y{<Y55h H9!O Kٰc⟕rR7_" tܧ)ms(tqlk]: hOֹ:xehw:-h[9yG+Bj&Zn>^NIQkb"I)FjWG" ARRߒ,2քm#QHTM;ݍ3׉DQ,UziϷْ,֤l.C/#JL|4v]< +?-' tFEߢL}p}tn|^r42im4OcLr'Q!j RAPdHIgOoD)j`DvAs4\ }!IlZVmHOI!c,?ҪKXG٪_1 5,x)Io# ?{O^wo^g/jԊlk6sdN7YN¢c{ɤ[m%sl\R-8E,پ\%D&Lo΂BnQ~Oi@C܊aٺxfmE_]NwzhnU^VOv2O GyybOm}*bpν1[R֎19 Sp%򜮜 Bӕ풙[LVz؅9'K9.9oyK2s?;`Y2]KlJvvY%O {ګxso0zɆ9)=oyKSS'|uEw\4ـ|'yx%M`$l^hfw\6gH ͟n6(jrX_u  D@/d[CN,}E|9)wbMA7L(eWMH>z.*$nMAz%lOHK{5Qu7k%F[t: Tbyk>f-cǘLJd)I؂YU~-2 ѥN_T7Ud_TV` Q bB -V9hl`31mja0Qp6^ 'Fk#9l%{V,]Ŭ{_: dc<^0r/,2k83y_k1lSwd B ;Q>0?u :hZVoV |Mz% rT^û4)Wٽ2}WH9qM8uuΙ'$"2<+0=꧹JB )geŽB(e"-jEe/SL@;4qA4 3F(~ƼEl57a>AFD~=-R|[8Ѧ"c:4Fr%Js.IV&8"M.SG蜗yo ,`e"`CHw'ENY_8HiTa)2$5$N H/Ms 7 U%` tfkPoMʬXRLΧlxW®녣RsY^|2Y&J2nn<*EUC> (ǐyBKQC> _g< y<٤ |Y(4~8J Qp>2vlaq8~M'IQ-6J{D,rQfqw7xΗn59$u7Jmhl_/C@l<Yf:a\tH3/ٓ^؂Ǡv;/LJKLDb*{SbkѺԓ'N.*z뱪gt3&j]ؑ^'1ᐐġqg³FߣQ8X00w`:9;,g]4\+D>C/ݍ}T꽅7)UWǰzb.G/ a=c]؉+n)Կ]}\JRgfQE \ 1z $2FB!hkp!y5 '˯An1m@ jpəTA-Jc0֊tǥY>-Vs0Z%{M(WG~V5w;~&߅ރ\K}k*o&S{ھcyBuzEl$Ql#lϐ}Fqvo/ :c4OW#q('իU=]kDXIJ.%V.V׍LLRIi{Bq2\FY)֚yMDeq>H(kѺƙem͢'ܫiY0קbV)V.^LS%h6k۠c:fL+pKɅpH77+P`fu'Ey!.' n*|\ZsDbQkYio+ ;vOӆ:5R7]Dǩs>"LR (:&"Ltծqm0]y0$O՜)!p1;2'BÎjɩDbF(DiTnL9@E|(cW/ۀvI8Eczx8>HoTOp%!ZH r{evy%az->g6|$Lbqgo:!*;k*0² * "Oӻ`w:7F9=C3&aw%lE((F>:q<.ϭ 򵎆Ww)dWv{fqn\k&4l3J4X#2V1ygy/sq^`+_ege4;.ƚ0UpkqGu~i wqpf1Af2r(Qݞ[l$%eMRK42,+(W,W i V\ 3h8K?Š5Δ֐޿HrU\Ă.͝trM,eR_L7;Xp kVߖ!ݥ|çr.ےv !qH{b9뗂 qJAҳP=y|A. 
2u9_ʖ._* q8#@W!$BҳHP$Bo!fI̅^bGV$jGïVA+`08*$.VapTH $۶AH2$(m8ui'*\hvgglB)oGoj+%bClD zr4AoC2t ""hr789;i?")@f L)VTK3TpDC9,Я844Cp?XRKެ`*wi}p{q0 w a˦IWWA20 呔_?QCFn:t n0v0=r0g+m8pV ɧڦSwj~<0VA(gJBf!L>wr*2A RMPfÂU1uϩ9Ŵ ^1/Є"5i r289VEZh\`ۑ a8r$%&g3&Wۻ?,WcE+Rs$@Z l,124)'ybϓKرIe(E0>Z/Ca~[Y.b3^.KȎV(*A1b 2+EuD=wZaF D+s#/?=T38 +"!V'h\ؠbBT]rBVk/)q=5ZLBLD:cNw9zЍA%dhzf w󺸖uC@k۳mߥ5^4Ȫ-5yxœ1p9C8¹ͫ_B`j1fg#S5s1zAVI?tJQѬ~*9Fo ;l۫e5Sq|ycx2hJ% I3MUhJz:U&J67P|l5ɮzx,Sර8\;Z;Osʃ:,F[ ^սX=h%:L4'ҍ`S lxOսX= B9zC'd YjdrڰTbtbJt&`Eu*!EMAdD W7}@ ^ƴ L;eX`3DSaX+N$5,O1!9*(bD0gZCQ9!(Ll&l G2lX^zpFLXˤ͜`?C2P#dyl7`.2R2R)4.ieƂS 0ZeU~Ǯ:}Jt|iZ+Z>ii'Uٖޏs=1qք-KS^/!喳adf&]<5Op([2Džyʕ>/u{fo ^~(e\+_5o>n%x >y>/H?rn`7pڣ)pz5<5s\bl\i[ۗfk.ɚ+nc^z>7^(m}|և>I˰ Thzl%e\WlLFR+ӿVϒ,7t+be=+SЕ^:Pt$ y&bS;|xപnǑz'3"JP|3!-l>9|''ǰWnmJض;6&ZjڻvSR`dR%u^C䏫xrX#'7BINӶvd!RBxh|+FvҀp']wj4~LP:rJdaO2s[ d,q.w2.+&rp[[agFCBS=MK@!b{9V0;ZtQ Q8j4i: c6E^UaHp.(XzKoV UksKZp3* ^E!({<Q@'،[9j<\Rd.RGb$fϦ6?lj󳩛6/TSr 5Vg&5?F0׶G,J&YQfLJx/ ZCgԹYFCv90^iWjByf1%H*b%5T\IH#ss ,g[R#-ƫ@+1F]5uD 9J{& $#R)7iXdZN%0F:fjbm3v&j]tAX86pQa q1:]s97Ɗ{J&%Gl(30 L.:[ٵ8I[Wtڹb̬ p!ۻ XwxmsE@FJF潷PA]nAB``BpA1:Mij- Utߪs[Q{bL qu~.7X,ʾ~]gF󂣛78@e+[u1 "UVbdz,$LqsU&9fZ>,%l~^HSD'Dg4IMV3JmKW&w߼ԛ/; ˸\^< N}?p8!LUɇ5C¾JO*;1Q9yڧORCDlNvL:>zZ<ѠH:ҰK̘v77ۀ<63]z!ˀ덍o6ۻLCl8@:1ȕ" Q /!`j Uw|}t1 ,!FxATu(Tm(=C1Uޔ'#}nl@{ګ9@Ƕ.T±jNk\/8 :90 `4ʍ)9q> b$(Co;^A1ʒy4hjΣ;ʒPeʕ 9S89MUJ(8 R.2 ˈ̷$AA%omĘ3,)ƚ_x9DU#aafFV4ߵ1R_ov2R==շ"3ck>|YK[,fWwf>Q-)\L[R% #ܟ]4xPژO/GuC*+~Q&?F!u'L#` _fjFݘ_'&(U(&Ub~T(hc"%չbQ; oـWw 5x"Gݩ Fm^ ɓ1lyݱ塇:0?';/:rW}v(RbQ|̟tB{;׈y+.?}ndu_ӛGseZU^^qZ=@ʮAjsQh<9Vr BD@׀ekwETr8%݅'H ӄ L``?2TL1i3 3 a&58^e [<}xQzy֩ WB:?苩#\)`: D(=E.X!s=e]HEM@*43!fLΩb񵸾8Oz!dΩ*B# va6F_F, E+ERR9$4S0 : !ӈ(Rʀ=P`ܱEd9V8CHHR95TPS EAbq^ rۋ{WVjU7GmVZ_¥RClDB`6'L,\J2u }}8DJw@rN=ػ6ndWXz h]'ۻuǮK0@C\l?!% )Rp.e&)Ǣ@Fӗ"_.*Lt, s4%=!PG],b8#?_{1x¢S,I-ņ)y<^Ec'Z[5oײAʮ`|$ctdWD^+×@RS*ݏoE~ț@S%:-ݺDr܍§˫Ooq-sLrȌ1ʥ(-hcfUOX)Ӥڻ;je n`QHWIn5T*Q@(a2(" өX]|G !|oQCqͱ\|/FNC DR*Q7KbTt`Q v`Q vUKGm>/h 8>R0ę@q(`(Hr5j^tE7fGxXg ڝg[GKU8/]ɷco#$wd6*״zpgPRS8;lg|5y܇/+!Tk# 'h(^̏(^̏Um89ͽ(39M[{]0ܛ= p,TdmtFڨK3"n^o"FV^0[ShM55ԨjMU٫1Ll,QVRC=ұF,Udan:Q#an:¬zw4Uݻ _{>j0PQ)w79C6RnLmg`!f^4{hӡJHMj8Ww"AAz)m*5Z6y@36xWmcucr٠Ȳ񀘗Yq5VG֑ǭ HJ^-'x.J]\xK-R9sVxx*Y@cWL;6&@@c\[4H_.yzT5tm,fs1?M\((ȓQyz7@[h11Hpٛ+yx\==(/|:1pȒi?~~ (i|{xgoQ_t})>C@fjK¿gOoP\6g[ ySW[LVn-¥9_\ :dQ7:>=5!M}K zeΓҌ\׈wh^Hd^p븠H\qA0.~ʮv\$~lZP ZXL.E/`\ʀvsάt:GA%96@Q߯ۖD:J"HaiOrpKe`:؈6u5͗sSSk 5 #j J^9Ѿ RPeiADbLX¨T PhƉQ'&=S{3bw^TumX ygC. 
!Юñ ɭbƫ;G4͕E6ף+m2.4 Swf JaL3n<(D@mJ CI6 3x2*/-&E'V2$P232sJ/W_\ta~/\#-XT6E w }Sn-J&Ubr.a\dC)j'TM(7+H핷.fEZY*W1̈-;rC`h l$_";VVPz{#%iփOuj\WJѷټ~|hHA*UR%%Jv_>6!ѣb Z tRkBt+@U]`$@&,q_/{8E]%HsJ&hӎngsk 7\\Bk}-.@tMߧX @ )I[5C#m5q>bEPf}rQեZSuY_ $^N :"JhQXZМ;2K =/ARu㾢fV [̶Eobcлŏrlul8O쩘3Rx[betX6[L~-9ċ'N?L t82S@Vg|,}e%׳<"Ix~#M~铽Rm=޻˟%mmNܠO۲):Vu:'UuY.Wx َ9^!z@TN[oi)6ɀlN%f'3%(ɦ8U2TYK}oP:]v"]kM!H[20v_ʓ2D\sdk] j5:N@JM8;Ơ79 TV^McR˨ ,o׷ ?E,!TXwu}[VÓ;+G' >?ojp̧1xR},ן}\if{=n1[\/W95$/}@_ k;,tDسnoKg-ܬM*鑽DX./ V1;[P9h;;fXY 2ⅰHUㆺ8:nH9XĪ*" |A80huܫQTӣGQMjڨC ; |![ƊEnS\ B/P',+u4Q,Ib))GxRU7HO.d,| 夶=P$3eco بQ.VpXBwJ+Y( C Hp)gtM7DPrƍk뺾 3N)ˌ*3$-C˂>QN\(7D+ZQe^UAd[u,ju(Q^iz>*0V/kÍډ{q;!!ٿݤ<qeJ#۩UCcz1ۑJ3mݯ cC1K'݃2gڼٿ WCÙg͠uL7 G{v A e"rH$cf_/kћ7)ʫIߡNp#fL5f΋C94f -7T䝲C6B=]ew/;吆;c0X;5h81#검Xho)kOb֢b_bcIOfR௉88G DNrkC`׹YAZ#!3Łqep%~I {ecjAMnPAL!fg[f>3OLOǹßn\1ʊ~<)~m̃wKmDα<.My0I`N(Q1-CJ.()kڞ\.'W ]xm}qvjȋ$|5Ab,$$mmk2YvJ(і%ܐYo^z$5SAuVKT x h ㌦J`< Vݠ:mVF:mmZ+O۴m " 1Ԡ@F݃ۢli~^!sȱx8sd\7#&ŸGU9%ݼqcLյ׀*2:7;aeח7vq X ;| 0=eEp/QXu j$ZoU>]Դt`u$ek䟝/߰&R?tE#JZQ˔> 6(Bkg%$Vr2}M5^]Z LgU֟i=o'N>f-3!y\nMHڐ@>7 TWWo*U(gpPճX\Ge%z{+ 0MHׄļEh(XIMD7BO°~%=pa,syƈu`3]x46e=vq*)k)VJa:CP Wt{i3OA^y4 D v̱[-e$R{v/g:Z,h qiY30 slEk 2gjh:zPCOOjkA+BsRz:fv+;9UtIvt:jA"4ٻF,+S}1P3jÀ6labU-jRMb(.̤z(mXιqXnP辢T𞮳UÂbbURBճ#5aaZKW.GvQy],-{~\nn]Ӓ]g]AQs&WCK\$'EYDt\_9 >-VpbGS(m2ŽLBzx51'*E hP{F^ Zd!F8PӐT7^_ iv 9lwA+%#5l$iE&&E4ih:j \ ;'#hY  =N3kPpq.5@7w _TG%# &1۬]j9@e' *Gj:~Y xp- lԉHy&)}Sz3%tZH]dH?E /E[(ଆu[DEhũRR'^<5)~Q~ SU[l>"0XaVਸ਼ }HJzqH`Fh+TJuvL9[$EqSTOgl7ꔠV"}Bi܍xy ~\l#)aMZU"o2S* rhACI9^\(w<;c1 ͐wzP$4"ÓSA1|6!V }1E+vڈ;qFjs\eGEk(T{ahI>qgq.zk!4Dd 9S0AWrq ^6GYC#mgO !|ZTN{yV$LuV!b`bB G cknnȱQښP,=*VL$!pkp8 kAADRc$PUz]OGn ~oC0xBPDC"D_o%s7s#2 iHq)zܛ#Dy^2.Q~)*U$D\Kh.e^`˦ %?ط(M.5Vt]=%j]( !Hg A@; lfSΤ 9[CMd u)"%B?ۊܡ$v9s~&;/ٲ+'щS_C&'}$8>6#" LxVs7 +!WH(&0YKX]av2覂_k#Zgx@xV":I,էH2 fL 'f*]ߐS k‹`u0c*. ޘ9Ǥu "8,*G*$hNQ)Ӻ*!uH5%cg b":{no,8Ywf*Mn/#}%S3P1-|;8 h6(2S^_XM=*jrUId("c(d A`p# cpHbX4Q)>\D )0R_`7kiuѣܰ^欮t&)RD&&o/ vN湪$&8ㇰ])<Ċ슕G{1W"t!!["P'$c 9{˲8DDqym6D0 l(6RegapI)wW˺lCW7OJ׵Nx$t/,qyIݻTC/q'S ,=JQ(Κ{gWK Ț2Q7JS⯌MJ$īDt6V5隞69ae׿t0"^{FV1A9S٫:OZ:{G_"-Łwn0[߹K7O]~>C/?,_<3WUnkه*^VA*#&pűɗj0?k$h 88 㼵^Md^yvЎ9y]u(/Ӈ`r@\ R<59^U_p4a2j zK0ӷw5nalDm3++Krxً흸xp3/KoFWs%Ӹ,eep' ޳ֻtz0W$2EQOZ)ǹtώźfzD*OSn{jWf8+^kz͢\q}bXf>,f6YwV*:a!LlD;Y,:u?P,7RV'E"Ch4+E"ZuLkUwYw!<Ϸ(^ìpVgfp3@ROQ ȇYubLzl(f}t齦JG ƣywuz[O+עH38ÓlɾT|\hRu ƕyq/qP߶iSyg; lbXp;ixF|S>n#!Fjڗ;hW:3;17{TݿUnvyvjNz]u*-wC˷PI~44l6|t࠮jhG-ο>.J{vī9RC_}06[ӯa:^ܸg9.1NPx2C|JA49m//fOˡ&y9zw/;w'Oճ S+JfYDX˖gT3z3H-ΑqرwH'A)6Isu{wo5ﭟ"#/Z7ȱ*qIq}T*3$AGPJܹ;A&0BerDfFeKfdY :#N 6mUm$nSۥd٪m˲u9 {eOS#tE./b[:}ۦ(-T_uv^;INh:I +17I#T9oVyq`ŅcRTc[jdxv:zY:SdCD}ه%^џС^%46<\*_OsLF`~>' d@Z F7pq(IkUﻲY!m2dVXi[VxC˃Y: M;?dBd(:{!+0Z5R!'/P<~) IW?(3- KlaXehmf [xލ}kTހt1O%dR [%$rI# PVkQ{<$SPX }1V[A}%V A2ݠJcp%.Kq\V"@\|W 4狉U)`:[u<+`W<8/x(e=J|s0m3r *M??/KR u^ >0絾`s_)mZϑ^᳻r3/#T\Ts!T#@ uIѴi笘R&$@RL5 a9~|9g8湭9nq+g+N~Sz9$|hsti/;dS C烅Հg:gk~6 Ix>CU6kB1AixD.5sMdԽ4NX l'>WƍnG](ge}"*rxn )࿟ fflr=0O{31CzeK* ~¬ٓ>f97h|SqA@` ~y݅^o4uR o+t %TqۑuN2RJ #J`,1ۏ߽拯Ef!/qXoT̥pNBix` 1Ù8鹴kJ'#P9\3-Z)?F7(>/VW Ocdr! 
Cxsd94uJEЃĒ L":RbhbPlB@ >ykYv W%岷 N'wx>*io1tbzo&afd"0s7C!D;T[Wos7.pN:6nNscD!sܭsGC6(䁓h'"x"b'[W BHkpogqcX:w4mB8v)ogW[;v tֱ wvK7e羹[mI Mɴw ~i.vu tֱ wv;nLjܭsGC6(䁓h'Rzwnq7-oݺbB:]؄}P7^vn;ݶA!D;є`qWwor7wEcn줄SL;w4XvM 2ЛŢߨ716l'7\3ʷE*Uz8~[N}s/gcSIvyZMVV\穳p5kt/_1>xb]>_s ^?!'ǫ}EUnB[A " t?!y}lx<.T=կ,U$w `/ <xG aů]ĥLDmie60UMʛhOc2!Tȶ'*S|2E,29HUXkIE^,i(˘L3PȅT4 Z}rՒ s9PbN8SXoh6oVz_MT:\XGs8w ̫E3n0-A^`;ʫ&ZP!~̾~m )pFP*Dm;Z?fŁ7a &idPqM 0ɬj:<>f+h9鲜-˃lXlLC%?f\;[g>2^zOTFЉʫр"m%T2HA^ܴ}Quo}QuAm 5AB`j44 \Wp yER=Yrؗv[O.cA>~A~_J|XN^\ne:tj.5{cfbVFT뭖iQ+Rao}r;FK!ey9NuQe8Ж8蛴HQVXPaPҜ:'mZͦu`ƀ`=]fF )~]5׍B.ϣJcczg ,T3C .(Ag;@7"@  z[1h7DZ;`A@XHo&vg@ej7@raAPC9Ej)MmR~[+sΕ-,uϺh[ 9Tj K-JiTZVXӜ8LCo`FvW'H;<>Loۨ-{ MuPj?is1>'B,G.qfKg4=N^UIAE\ma@R.<+c.uj*(mzTM_ӭ Afl^vjJUOef2ėv3~g_a..u/R{1+/hQJ & ʲV8 iZ3 T%D%)fbm*5_[YvI~ ?QԈ-SsU⎁_E lպ݂$Shv?6 */vļϥa\Zb)ф[+39H \x;0?s yownV3嵓8s "ZԴ!u/ 'v/6|5q$j8B Rd"B8 C c!!&/P{03 +`E47  8H5*ΤvAy@hBВFځ'cౄsq"@x"1"R^K<'c0U6c~B8 2ڜ ~*9L T >YoRlZb /2A wZ^%]IK2Z}t x:!Q -cyaF <) ^nƉL8m[ୁh$Q+UBy1M֒E x K$x/ y4j-J(԰p$D+o JO'(Eqh hQH^/U"9!y/-IP &"n8B;:yxyٵ˯x铣g'{UT&ECeM|"_ h{͓U}~;}vIp5@7k省#-O/#Ũtpcf zuᣫg׏%0rXU_|x`ѫeZ]mOgۭz=0zi|1zoAU?p/U?=aR~z{UK|ݘrkϢ _cn$F{[="WW_N\_z:*lvob4jjB֍Ic/Qv9|Rx۸D$%`۸X = |qLfE<q/pcDZ,q t"4 &Y\0ZFѳnS(͠Vi7R^I'$hUi C"Z/$E]{eہG1x&)@`(Ƴl c= @ )8QkZxD/0"Dz]o22j,TV۬H,EY-bEۛY7AɅkgyS?.O˹q>ļ:< V'^}WO޼>|Q6ONDaTOs9c͢\E1Tq5(~׋%bH~o@r9@Q[wƉ+N7sR ^4ev<ۻRJrwNYZKD" S(d @0  Qm $X/$Lu˜-x)<2F<O3(Έ<%xRP 4 3P-slK(l  G 4 3"ZǵBQ 1p-lMX($h e2Q.e\m[G0qCJ D@K8gx<$QG1j@[x&)9&<:x9qy.Dl S(&FZ/i1>ڂAP5MR'ȥ "b5MR\$DHz<^I˄,e\0Z 4I3ޘx&-xD u~jYq6x_w}}(ʼyq|d'g .Ѕ7&0T1?ͦI 7NN>l׊1:Źk,udz.53(>goLFcnmHDTr*i?{ƭ$2ŗd_TM9qq^֧X F[xql!Ebdچ-[ȃ#׍FhXQj[abi ߊ>75d=& _9 /YHͭ/S۝jchjjښc;ZjDjKNjDjJH:b\[E .d%AiN _eW.t2(D=zR9n8B2uݤd‚ݾFI Quilgx#޳{ކku{= =``[U?sF2-KR#*]:&kAT&A5C!^!J M]O;G!( kQ@?v1 ߇Uo픩 -k* NeT`T$7XIx@e-tzWKE£rhdn#I֑ܙœQL.s7mz2ҢQOsA4:򯑏y9/Syx|al6„e-/gޛ9q_WK7_?ٵvcC82_I=L>hu9]~V Ž4G]jn:@/(N?$;v>d5C*NmƝGm}?e+ky8Ύ|T}jߌOsKe`%}Pd. 5 E,# !4OJR[T#>q&X m±}0<` os6l8[ŝ7 KwG8QXO44Gx< rX'LW!< O#^Ɏ©da 4K*A҆#S Fq0*$f%1 E.ĸdytNxyZL!H<0S$x!ȔfR6]Im^dJ s_t'NF[D$/"Jdy!ōDRP$x$u^dJ! :#mx42%EQA`% ^GxR|^^dJIS"Տl,FI!SwWx2!d(Lliʪt'Km ^dJrCi A ƌ/#NO6^dJʂK!Қ4SiY ^F #MDyA~~rhN+F߁9r!XӖW@p5[+i|GoζC_ A?.D4>}qu[VS%`,i%5XRUY!Pc5cٚ!#j*TB O۰~߻m3-Eڪe,\TK5L0.%SL׸*+.$w}=l5-0Io D܂35ac `X`||%޾뇟I9W|Smq#v C3tg0'i7R̍)eΐy8˲ԶqNG#rw\jq{IC x}_xe`W+<$89ght~rG/;4R? q ߅p8 f [GY31W=pӁl3N܇2 V!0'hk]؞Sv6NgxOٴ|~ѽ~rU^M;7e>h=b~17)OKUܫ*<}Yu # ǁ//~ VJ" Bϟe 1&fc?vv5nHJ ]]+e!9g> $V0ЍhC,#+F~%'ܳN SVk-vT6X֨\Ht%* wFj%a g |2`bo˹9ug~-/UP7Wgj=v Py8h,S|3lZ8Vb5_>Wy "@=y8ߟz8_,7xjpҜ+ow!*BJ3T/7/>q; Xp'wn'an0 #˦п&m(x;h FT~{B^0;BFFZ0[> _f>yl{fއ~[)uG T܂6WؘuUj3qG*$|8SLM*؛( |r-/r  ]W^8*>c\\)72wEf8 .4Ч\ROJs9+ KbNsyy77J#'j:}f-k.+˜\Xg/j N ɕ\(c Vw=ROnI+’cw;wl`U>u}u5pXkLΜyi+ERTݵ֖C8T^ ѦH^(^*CUVX-{[RoW}R2RZY`ɐ'v:&M hd !wY2s Ra<7kexy ?ܾ;wr~7*-qWǣ]O}R0&EZ&%2S0X+0SCG}9 9{RC0]^˷ 94(>'T!kfd}w:Tzb'sW]G]_!sctKr)fW~9E~"UBSXDoݐ4xh*c1eIE R:fw׋A4_0 3@ AlX%8moIHFEqEأЫ0GW!fQUF~P 9qIզ ˇ'qO hjd.uRcCuZr2ksR͂U'Z y : ce))i3A4_oߌ\UisR3w=r %H#dJyXMn$|{șѪwfZv]6OUW婪umڣlvкg1 r:G`JB%:&W/FVr)ɇ OD?'B; ӄQ8riةO.Iw+VLp atSv Fjfk/.AN봚B%tPχo 9@`j"Zr 1 s[yAx+Y"렯PznwW]j(? 
{!n[bD<*jTl"FRD C BuTI~}!+=SV&r1͟˴=VivobB13%MZ%3-2Af=rNb;%;88GB i^VOrG!.sa!\aԍ pҧ0FiUP¹"Eεٴ Z PLZ Elk3,RnI$ 328ujmJȲ ;%b Ixuz#;5)3|6iz=}0AO}Yzt.&m獦 OekCM~O'qRhT8=ު+KV(운봶u`Op:-L 5GҽuNJFDnJ*.xAizx(BhVFN\1g$eՒ>McAm1;YbgKm`QLģ[GtQb`  XP .:̷SQ;RQL )oVO5Jãk[h$@{g)ЉUk|۶5Q3 :[葁Z'(}N}mY&pu7F3ૣ $}؃h%H#UUDՎH%k(4*l|hX`ghus>1)?m=HʁW?;%jsgF W~z rn,HlyySm˛a0< vTnQ?>H?xO!;X5i7ZHcZ;\nؚƅ3ΡZI7#SR2r7#"5 .~HA)P)lsn>a1vA @$٢x b G%4" y$MskCqx .at[.c_~ssLl4.jY3ښg$uR{za~l]v*嫮]_^_]'k/m"VDNA!kq`A5?|3퉦XߜqlfNWa;Z2s4|gU%|I9M4f5ZC"(2F AZAto[lOؼPT 1Ϋ+%q!_eqᬥ$z'q0\DZS`V϶L/JNl-Q,_;7Em_If샮s[2[f |X~._1^4դHERZҶ^څjHMf)Z(`+OdnX q PŨ?{CēOߛt֘n6_NjMzA]G ֯~ nܳ$r$y!7Yp\aέ9_p`9]`_g| `H4{27BlebF(2hm`UHbܾNP4#d[e$fQXE4^\^ #F"?V.tΓ&h񥦝g̹?XPv[ \uJ'W߿i/?u{K eP\9ߕWy8v "O \19;P/Wwh7WXlBw?߽ N[=ywY͛C9ج}7_9ʯq{ۿ]?4?'ר*3F^uxtv]kc\Edǁg; _P-$iu}roWB`2Xr 37*m8hRH§[qg`)٭*Wrlsѥ}1 6Ay #ZI WV6-ˈZ~*7 &_M&zawg[Z4?Ua8|b ی$@0Z薃:u+C F)XV`lAA uNBf [vnV5 . ժ8[[](sde!RXċRnYe;F_O\0\_'ky9cK_/:mX޷>_ߟ1nF~|@r8vyB~6X`LI`wj]exgT)r~T]/‚ 6R sI A)IŘSHa Af"0s8T>FFa@7Ns@nHN-y/@txg/y{PJg*z3e豨B $O;ivOϮbQfa߁skPH+7j Km8Tle4) ̾` ;՟[φ@ =myx6v܁Zz<~ms: C g2|a}9.7\XX>:LA`B2bpMcAjQ+6+T(|Z4+dս ϗ$jy!fRɈғˋw#oUiϖ%*D/&MKbPBcrSdL ưGnU6{[RȻYա\лbPb:c n[>f"{z1,䑛hMWލpkZ JLv!풻ݪ'ZưGn6UbH2#U;of-reHhmkG#g)PQh\dcq_0ҝ.S *`LMd& oTh@%H)I#BP76Iz48Moi&6 ;I@߯.Fd.cL OFE6!滭ha͹ECgRt0F 8EDMt oX@I}9KM˯y'[k"%o0y2jࢌH!-Q\b]ZuhE2E;[;ʙrE``:''s $'=<{G>Iܻgi1F"qШEP`F'چ=c$+@ +i}/85D-QzG2o4XO h$ZħPGAAĨ)XjcQ*jTl"FṚ @{f9j/5*\˹Bz>6<5:gZ&̦"NGr"` I ] ![|ԄV{rLi ҂:6Ɋ#bKArD%PƀFlz%B֍(ηVixgPj4 ~AC-%ɵc a<\.$"x_A2ưGnU6y1ȻYQөcwܺ+n-ncX#7*"ZػedbP+v.;_FGn660Ž^kl7'֭l=9N}Y!BN;pk!]kpMӺ[Ӹ6gHS nM.m'A+6[*EuR *-Ӧ\"l641bl,EZ;FM՚| gdNj gɒwte3'@.j_ d/ |.p߲۞MV|Rﴋgi1(I>Z+lc." "J(pֻmf%(6 N2F%u>ۛ:@< jښ\[ λbooOj‚n#uK:J)B\xq 5 v ׎1r{ƤScl"cX#7*.u%pݤ..c j{]ҩϻUOMʦP.~yݨ}AVAԎ1Ļ]pA;n-ncX#7R66Nر2gb4 sF^L ^$}d ̔ S f\1e]rτm /LfƾSA[T8[c hF&P@Y>eY X}X8h!Iowm@WzCm6Z^i;5}I8ϕC}͍wGY?k|SUwҨQU)|3.rdpz)/f'/$Um[ND=L$ (AN|^ѓr4N|(%j'4=B8ycRtpEHbc a9 [+7J'Eytw'e ^D2$Lю~nDjclQ;fk[aB{FrQ$c5Bs>a̹P6?w&5{eYo+FnvӜ,")zҶT?4{ZՏiEjB_3z[T<&b #ovS |SnSn v@vy[\۲]m}dk\:=o@fE%ho&CxWdM/j"\MaQ*7kԔhĒ$窳=l? 
14944ms (19:34:25.474) Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[1825580615]: [14.944961682s] [14.944961682s] END Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.474844 4961 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.475392 4961 trace.go:236] Trace[931626326]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 19:34:12.395) (total time: 13080ms): Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[931626326]: ---"Objects listed" error: 13080ms (19:34:25.475) Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[931626326]: [13.080315575s] [13.080315575s] END Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.475413 4961 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.475550 4961 trace.go:236] Trace[628370330]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 19:34:14.348) (total time: 11126ms): Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[628370330]: ---"Objects listed" error: 11126ms (19:34:25.475) Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[628370330]: [11.126554007s] [11.126554007s] END Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.475595 4961 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.476029 4961 trace.go:236] Trace[1995904819]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Feb-2026 19:34:12.839) (total time: 12636ms): Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[1995904819]: ---"Objects listed" error: 12636ms (19:34:25.475) Feb 01 19:34:25 crc kubenswrapper[4961]: Trace[1995904819]: [12.63612356s] [12.63612356s] END Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.476081 4961 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.476109 4961 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.486520 4961 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.486949 4961 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.487274 4961 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.489219 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.489361 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.489387 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.489432 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.489458 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.514486 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.524611 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.524693 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 
19:34:25.524719 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.524768 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.524796 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539088 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41342->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539186 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41342->192.168.126.11:17697: read: connection reset by peer" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539106 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41334->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539404 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:41334->192.168.126.11:17697: read: connection reset by peer" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539829 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.539878 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.541251 4961 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 01 
19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.541392 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.545158 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.551122 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.551179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 
19:34:25.551191 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.551218 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.551234 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.564513 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.570685 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.570747 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 
19:34:25.570765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.570796 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.570817 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.582176 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.587625 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.587693 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 
19:34:25.587711 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.587746 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.587767 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.607122 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"si
zeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:25 crc kubenswrapper[4961]: E0201 19:34:25.607353 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.609687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 
19:34:25.609908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.610067 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.610219 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.610360 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.718761 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.719182 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.719341 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.719487 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.719625 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.823595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.823934 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.824127 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.824303 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.824490 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.928117 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.928198 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.928224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.928267 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:25 crc kubenswrapper[4961]: I0201 19:34:25.928294 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:25Z","lastTransitionTime":"2026-02-01T19:34:25Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.033841 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.033918 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.033941 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.033981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.034006 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.136758 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.136830 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.136849 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.136884 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.136903 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.141183 4961 apiserver.go:52] "Watching apiserver" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.147068 4961 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.147584 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.148219 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.148438 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.148663 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.149154 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.149229 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.149454 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.149617 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.149835 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.150018 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.154187 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.154788 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.155255 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.155514 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.155835 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.157020 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.157738 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.158641 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 23:08:15.780764994 +0000 UTC Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.162448 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.162387 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.207884 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.227422 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.241837 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.241955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.241980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.242012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.242060 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.245218 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.247816 4961 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.263247 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.282069 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.282157 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.282203 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.282286 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283075 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283185 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283220 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283257 4961 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283300 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283334 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283368 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283404 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283438 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283473 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283507 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283545 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283583 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283626 4961 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283662 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283699 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283738 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283782 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.283819 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.287938 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.288730 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.290527 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.291466 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.291761 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.292030 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.292017 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.292497 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.293215 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.293513 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.293869 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294136 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294215 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294255 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294291 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294326 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294360 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294417 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294437 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294596 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294653 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294697 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294735 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294741 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294783 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294830 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294876 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294922 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.294985 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295086 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295577 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295662 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295722 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295773 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295829 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295880 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295901 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295934 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295975 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.295993 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296091 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296146 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296214 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296273 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296325 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296381 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296431 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296487 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296537 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" 
(UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296589 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296694 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296749 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296802 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296856 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296903 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.296956 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297015 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297104 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297166 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297193 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297219 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297277 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297334 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297386 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297435 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297486 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297537 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297587 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") 
" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297638 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297726 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297778 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297826 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297877 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297935 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.297990 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298080 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298134 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298184 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298454 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298551 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298622 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298747 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298775 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298808 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298870 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298992 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299084 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299139 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299204 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299259 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299314 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299435 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299487 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299541 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299676 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299739 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.299791 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.301271 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.301515 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.301728 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308886 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.309551 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310091 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310351 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310404 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310445 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 
19:34:26.310486 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310521 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310579 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310615 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310656 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310693 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310728 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310764 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310803 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310836 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 19:34:26 
crc kubenswrapper[4961]: I0201 19:34:26.310870 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310909 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310941 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310974 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311013 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311079 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311241 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311279 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311314 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311350 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311384 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311419 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311453 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311509 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311544 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311708 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311752 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311790 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311823 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311862 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311897 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311932 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311973 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312011 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312077 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312109 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312143 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312176 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312211 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312252 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312289 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312326 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312363 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312401 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312440 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312474 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312518 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312557 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312592 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312627 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312663 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312700 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312741 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312776 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312812 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312849 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312886 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312922 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312959 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312995 4961 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313031 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313096 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313131 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313168 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313206 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313625 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313670 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313742 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313783 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 
19:34:26.313818 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313855 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313892 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313929 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314078 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314119 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314153 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314191 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314229 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314265 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314300 4961 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314340 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314375 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314412 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314449 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314486 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314519 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314590 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314641 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314683 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314727 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314767 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314808 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314854 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314895 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314936 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314972 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315019 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315087 4961 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315128 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315167 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315266 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315289 4961 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315313 4961 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315334 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315359 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315381 4961 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315402 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315422 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315443 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" 
DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315463 4961 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315483 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315505 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315527 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315549 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315570 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315590 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315610 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315632 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315652 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.318816 4961 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.349228 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.298802 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.303306 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354898 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.303360 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.303676 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.303891 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.303953 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.304186 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.304285 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.304780 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.306137 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.307440 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.307730 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308211 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308212 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308266 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308609 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308718 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.308969 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.309132 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.309537 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.309954 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310466 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310561 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310832 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.310947 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311304 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311360 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311345 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311478 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.311771 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312202 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312288 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312698 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.312791 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313473 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.313802 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314249 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.314591 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.315003 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.316519 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.316535 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.317099 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.317294 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.317616 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.317639 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.321646 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.321666 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.321675 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.321841 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.322297 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.322315 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.323228 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.323247 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.323346 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.323418 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343257 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343438 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343534 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343766 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343813 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.343853 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.344547 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.344633 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.344816 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.345406 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.345417 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.346072 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.346320 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.346507 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.346507 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.346612 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.346714 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.346797 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347225 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347247 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347406 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347604 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347696 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347958 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.347995 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.349664 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.323946 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.350324 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.350882 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.351456 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.351528 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.351892 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.352493 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.352601 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.352793 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:26.852755378 +0000 UTC m=+21.377786801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.352867 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.352896 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353194 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353496 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353497 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353588 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353825 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353856 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353895 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.353926 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354314 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354332 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354481 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354499 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354657 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.354763 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.355177 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.355432 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.355438 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.355747 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.355782 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.356122 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.356490 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.356422 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.356870 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.356982 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.357291 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.357726 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.358944 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). 
InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.359548 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.360142 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.361718 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.362218 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.362150 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.362597 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.362762 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.362753 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.358833 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.363527 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.364200 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.364579 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.365213 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.368482 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:26.868441375 +0000 UTC m=+21.393472808 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.368895 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:26.868880688 +0000 UTC m=+21.393912131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372123 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372497 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372853 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372886 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372898 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372925 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.372937 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.357738 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.373694 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.373881 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.374301 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.374488 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.374454 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.374770 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.375857 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.376132 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.376157 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.357782 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.377074 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.377452 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.377862 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.377987 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.378447 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.378580 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.378610 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.378815 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.379013 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.379428 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.380111 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.380446 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.380518 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.381014 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.381213 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.381433 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.383357 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.383584 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.383512 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.384710 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.385002 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.385548 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.386847 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.387410 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387086 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.387799 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387384 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387854 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387874 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387912 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387941 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.388117 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.387510 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.387978 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:26.887950393 +0000 UTC m=+21.412981806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.388810 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-01 19:34:26.888703954 +0000 UTC m=+21.413735387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.395321 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.395481 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.395602 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.396589 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.400745 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.400500 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.402151 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.402589 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.402638 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.403242 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59" exitCode=255 Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.403292 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.408521 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.410127 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416571 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416635 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416730 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416746 4961 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416758 4961 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416770 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416786 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416799 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416772 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416858 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416931 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416811 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.416991 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417020 4961 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417056 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417070 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417086 4961 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417161 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417178 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417195 4961 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417218 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417232 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417245 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417260 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417274 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417287 4961 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417300 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417313 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417326 4961 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc 
kubenswrapper[4961]: I0201 19:34:26.417338 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417349 4961 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417361 4961 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417377 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417393 4961 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417406 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417420 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417433 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417446 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417458 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417472 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417483 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417495 4961 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on 
node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417508 4961 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417521 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417536 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417548 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417561 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417575 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417587 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417601 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417613 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417626 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417640 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417654 4961 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417667 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417680 4961 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417696 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417711 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417724 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417754 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417766 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417780 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417793 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417808 4961 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417822 4961 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417834 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417846 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417859 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417873 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417887 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417900 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417914 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417928 4961 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417941 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417957 4961 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417969 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417981 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.417994 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418007 4961 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418019 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418053 4961 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418069 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418082 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418095 4961 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418108 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418120 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418132 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418145 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418157 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418171 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418184 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418197 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418210 4961 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418226 4961 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418239 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418254 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418267 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418278 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418290 4961 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418302 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418315 4961 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418327 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418339 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418351 4961 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418362 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418374 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418386 4961 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418398 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418410 4961 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418421 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418433 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418446 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418459 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418470 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418482 4961 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418494 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418506 4961 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418518 4961 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418529 4961 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418544 4961 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418557 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418569 4961 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418580 4961 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418593 4961 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418605 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418617 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418629 4961 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418643 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418655 4961 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418668 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418680 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418692 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418704 4961 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418715 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418727 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418739 4961 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418751 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418764 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418777 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418789 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418800 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418811 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418827 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418839 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418851 4961 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418862 4961 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418875 4961 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418890 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418902 4961 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418913 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418926 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418938 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418949 4961 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418961 4961 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418973 4961 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418986 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.418999 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419010 4961 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419023 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419055 4961 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419069 4961 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419081 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419092 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419103 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419114 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419137 4961 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419150 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419161 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419174 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419189 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419202 4961 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419215 4961 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 
crc kubenswrapper[4961]: I0201 19:34:26.419228 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419241 4961 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419256 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.419269 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.438142 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.438751 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.439091 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.439329 4961 scope.go:117] "RemoveContainer" containerID="8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.453600 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.461543 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.471019 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.475086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.475217 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.475306 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.475393 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.475472 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.484200 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.488257 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.502653 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.508451 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: W0201 19:34:26.511679 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-57f63a708f7af05610854f1a5a74f65e3d4e5b9232c01d5f3e08d4a17da72e45 WatchSource:0}: Error finding container 57f63a708f7af05610854f1a5a74f65e3d4e5b9232c01d5f3e08d4a17da72e45: Status 404 returned error can't find the container with id 57f63a708f7af05610854f1a5a74f65e3d4e5b9232c01d5f3e08d4a17da72e45 Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.520728 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.522754 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.522873 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.522968 4961 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.523079 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.529354 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: W0201 19:34:26.536466 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1342156bbc77bc7827e8a4c621c94dc2cb219a44b8d509dcdfe13b3a89c67c4b WatchSource:0}: Error finding container 1342156bbc77bc7827e8a4c621c94dc2cb219a44b8d509dcdfe13b3a89c67c4b: Status 404 returned error can't find the container with id 1342156bbc77bc7827e8a4c621c94dc2cb219a44b8d509dcdfe13b3a89c67c4b Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.546673 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.558862 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.570008 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.578459 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.578485 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.578496 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.578514 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.578526 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.580170 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.590708 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.605408 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.615844 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.627967 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01
T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.640342 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.684250 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.684333 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.684346 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.684367 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.684396 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.789626 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.789675 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.789690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.789712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.789726 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.892285 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.892325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.892336 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.892354 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.892365 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:26Z","lastTransitionTime":"2026-02-01T19:34:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.897062 4961 csr.go:261] certificate signing request csr-r7zpv is approved, waiting to be issued Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.921836 4961 csr.go:257] certificate signing request csr-r7zpv is issued Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.928573 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.928652 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.928685 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.928711 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:26 crc kubenswrapper[4961]: I0201 19:34:26.928732 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928772 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:27.928741063 +0000 UTC m=+22.453772486 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928859 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928878 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928889 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928915 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928907 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928940 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:27.928924149 +0000 UTC m=+22.453955572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.928995 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:27.928987061 +0000 UTC m=+22.454018484 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.929013 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:27.929005771 +0000 UTC m=+22.454037194 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.929077 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.929133 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.929151 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:26 crc kubenswrapper[4961]: E0201 19:34:26.929246 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:27.929220117 +0000 UTC m=+22.454251550 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.007188 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.007240 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.007253 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.007271 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.007284 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.110487 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.110524 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.110534 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.110554 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.110566 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.158861 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 10:26:51.822291188 +0000 UTC Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.213233 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.213269 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.213279 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.213296 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.213307 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.316248 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.316307 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.316318 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.316340 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.316353 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.408469 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.410577 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.410799 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.412761 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.412817 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.412831 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"46e8d75cd2fbeb8c6ba6f6f4025f6b8b538d7a8259d897204bd60a7f22a2ad89"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.413947 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1342156bbc77bc7827e8a4c621c94dc2cb219a44b8d509dcdfe13b3a89c67c4b"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.415458 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.415514 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"57f63a708f7af05610854f1a5a74f65e3d4e5b9232c01d5f3e08d4a17da72e45"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.418243 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.418304 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.418317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.418334 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.418350 4961 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.435701 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.452202 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.466480 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.476524 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.487681 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.499178 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.510822 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.520978 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.521030 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.521067 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.521092 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.521109 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.524223 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.537599 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.549493 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.573913 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"moun
tPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.593222 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.611571 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.622953 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.622977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.622985 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.622999 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.623009 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.627267 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.725505 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.725546 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.725562 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.725584 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.725601 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.827763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.827802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.827812 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.827832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.827843 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.829202 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-fgjhv"] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.829604 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.831115 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-c9pwd"] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.831245 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wcr65"] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.831406 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.831832 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.834197 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.834458 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.834567 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.834996 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.835053 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.836603 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.836897 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.836998 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.837767 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.841944 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.842072 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.842263 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.842319 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.867915 4961 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.884796 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.897577 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.917350 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.922747 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-01 19:29:26 +0000 UTC, rotation deadline is 2026-12-04 03:15:18.563267466 +0000 UTC Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.922830 4961 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7327h40m50.640439591s for next certificate rotation Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.930567 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.930623 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.930638 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.930661 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.930672 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:27Z","lastTransitionTime":"2026-02-01T19:34:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.935609 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.936825 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.936945 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2klsz\" (UniqueName: \"kubernetes.io/projected/0f613a17-3bb1-43f0-8552-0e55a2018f4c-kube-api-access-2klsz\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.937022 4961 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:29.936992454 +0000 UTC m=+24.462023877 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937100 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-k8s-cni-cncf-io\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937155 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-etc-kubernetes\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937212 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l69z\" (UniqueName: \"kubernetes.io/projected/9a7db599-5d41-4e19-b703-6bb56822b4ff-kube-api-access-8l69z\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937274 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937325 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6n4c\" (UniqueName: \"kubernetes.io/projected/4a4b5d70-50da-41c9-8016-8134283d74bc-kube-api-access-l6n4c\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.937424 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.937447 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.937463 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937424 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-multus\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.937502 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:29.937494378 +0000 UTC m=+24.462525791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937531 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-hostroot\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937552 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-conf-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937565 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-multus-certs\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937585 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-netns\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937614 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-kubelet\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937662 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f613a17-3bb1-43f0-8552-0e55a2018f4c-proxy-tls\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " 
pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937757 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-system-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937826 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937875 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-cnibin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937906 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937934 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a4b5d70-50da-41c9-8016-8134283d74bc-hosts-file\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937953 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-socket-dir-parent\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937976 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.937997 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0f613a17-3bb1-43f0-8552-0e55a2018f4c-rootfs\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938016 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0f613a17-3bb1-43f0-8552-0e55a2018f4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938053 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-cni-binary-copy\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938072 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-bin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938089 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-daemon-config\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938116 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.938134 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-os-release\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938030 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938236 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:29.938228087 +0000 UTC m=+24.463259510 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938172 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938300 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938314 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938322 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938320 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:29.938294549 +0000 UTC m=+24.463325972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:27 crc kubenswrapper[4961]: E0201 19:34:27.938345 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:29.93833817 +0000 UTC m=+24.463369583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.952821 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.965974 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.985753 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:27 crc kubenswrapper[4961]: I0201 19:34:27.999815 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.011079 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.024071 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.032386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.032421 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.032430 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.032446 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.032455 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.035621 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038658 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-cni-binary-copy\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038707 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-bin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038727 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-daemon-config\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038757 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-os-release\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038781 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2klsz\" (UniqueName: \"kubernetes.io/projected/0f613a17-3bb1-43f0-8552-0e55a2018f4c-kube-api-access-2klsz\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038802 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-k8s-cni-cncf-io\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038823 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-etc-kubernetes\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " 
pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038839 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l69z\" (UniqueName: \"kubernetes.io/projected/9a7db599-5d41-4e19-b703-6bb56822b4ff-kube-api-access-8l69z\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038869 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6n4c\" (UniqueName: \"kubernetes.io/projected/4a4b5d70-50da-41c9-8016-8134283d74bc-kube-api-access-l6n4c\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038892 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-multus\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038910 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-hostroot\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038929 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-conf-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038924 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-os-release\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038970 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-multus-certs\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.038944 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-multus-certs\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039000 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-bin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039024 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-cni-multus\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039030 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-k8s-cni-cncf-io\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039072 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-netns\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039178 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-kubelet\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039204 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-conf-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039210 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-etc-kubernetes\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039230 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-var-lib-kubelet\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039257 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-host-run-netns\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039295 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f613a17-3bb1-43f0-8552-0e55a2018f4c-proxy-tls\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039341 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-system-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: 
I0201 19:34:28.039446 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039456 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-cni-binary-copy\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039504 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-cnibin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039506 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-system-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039545 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-cnibin\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039576 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a4b5d70-50da-41c9-8016-8134283d74bc-hosts-file\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039602 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-socket-dir-parent\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039634 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0f613a17-3bb1-43f0-8552-0e55a2018f4c-rootfs\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039651 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f613a17-3bb1-43f0-8552-0e55a2018f4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039655 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-daemon-config\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039817 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-cni-dir\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039854 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0f613a17-3bb1-43f0-8552-0e55a2018f4c-rootfs\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039923 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4a4b5d70-50da-41c9-8016-8134283d74bc-hosts-file\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.039936 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-multus-socket-dir-parent\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.040000 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a7db599-5d41-4e19-b703-6bb56822b4ff-hostroot\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.040360 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0f613a17-3bb1-43f0-8552-0e55a2018f4c-mcd-auth-proxy-config\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.047495 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f613a17-3bb1-43f0-8552-0e55a2018f4c-proxy-tls\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.053707 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.058399 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l69z\" (UniqueName: \"kubernetes.io/projected/9a7db599-5d41-4e19-b703-6bb56822b4ff-kube-api-access-8l69z\") pod \"multus-wcr65\" (UID: \"9a7db599-5d41-4e19-b703-6bb56822b4ff\") " pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.061723 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6n4c\" (UniqueName: \"kubernetes.io/projected/4a4b5d70-50da-41c9-8016-8134283d74bc-kube-api-access-l6n4c\") pod \"node-resolver-c9pwd\" (UID: \"4a4b5d70-50da-41c9-8016-8134283d74bc\") " pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.066247 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2klsz\" (UniqueName: \"kubernetes.io/projected/0f613a17-3bb1-43f0-8552-0e55a2018f4c-kube-api-access-2klsz\") pod \"machine-config-daemon-fgjhv\" (UID: \"0f613a17-3bb1-43f0-8552-0e55a2018f4c\") " pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.068060 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.086439 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.099729 4961 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.112281 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.129409 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.135161 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.135234 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.135250 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: 
I0201 19:34:28.135276 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.135294 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.151713 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.158546 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wcr65" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.159200 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 00:31:39.160978002 +0000 UTC Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.165354 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c9pwd" Feb 01 19:34:28 crc kubenswrapper[4961]: W0201 19:34:28.171885 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a7db599_5d41_4e19_b703_6bb56822b4ff.slice/crio-3daaed155d20570a8d17b5c35b6834766dba25fa57e33c2f2419ad54c1722dfe WatchSource:0}: Error finding container 3daaed155d20570a8d17b5c35b6834766dba25fa57e33c2f2419ad54c1722dfe: Status 404 returned error can't find the container with id 3daaed155d20570a8d17b5c35b6834766dba25fa57e33c2f2419ad54c1722dfe Feb 01 19:34:28 crc kubenswrapper[4961]: W0201 19:34:28.189365 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a4b5d70_50da_41c9_8016_8134283d74bc.slice/crio-2adb86a54d87a2cda3bf8258b0c78cd09460291c93ce4babbdf53051e9ceedfd WatchSource:0}: Error finding container 2adb86a54d87a2cda3bf8258b0c78cd09460291c93ce4babbdf53051e9ceedfd: Status 404 returned error can't find the container with id 2adb86a54d87a2cda3bf8258b0c78cd09460291c93ce4babbdf53051e9ceedfd Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.235173 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.235213 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:28 crc kubenswrapper[4961]: E0201 19:34:28.235359 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:28 crc kubenswrapper[4961]: E0201 19:34:28.235508 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.235861 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:28 crc kubenswrapper[4961]: E0201 19:34:28.235946 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.255009 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.255089 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.255102 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.255123 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.255189 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.256793 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.257573 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.260721 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.261666 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.263004 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.263746 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.264533 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.266092 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.266956 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.277872 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.282396 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.283489 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.284900 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.285535 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 01 19:34:28 
crc kubenswrapper[4961]: I0201 19:34:28.286890 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.289644 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.290410 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.291362 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.291954 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.292832 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.295598 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.296538 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.297021 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.298938 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.299436 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.300649 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.301496 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.302773 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.303400 4961 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.304402 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.305107 4961 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.305220 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.306923 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.307966 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.308410 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.310079 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.310814 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.311776 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.312430 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.313493 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.313960 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.315009 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.316124 4961 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.319030 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.319556 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.320545 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.321828 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.323552 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.324192 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.324670 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.325536 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.326091 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.327119 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.328119 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.328568 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6dzl2"] Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.329173 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ng92q"] Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.329309 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.330575 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334239 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334352 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334351 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334387 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334586 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334636 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334696 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334825 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.334930 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.348128 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361877 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361920 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361933 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361957 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361970 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.361859 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.383782 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.406396 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.419689 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerStarted","Data":"254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.419738 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerStarted","Data":"3daaed155d20570a8d17b5c35b6834766dba25fa57e33c2f2419ad54c1722dfe"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.421078 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9pwd" event={"ID":"4a4b5d70-50da-41c9-8016-8134283d74bc","Type":"ContainerStarted","Data":"2adb86a54d87a2cda3bf8258b0c78cd09460291c93ce4babbdf53051e9ceedfd"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.422412 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.423667 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.423693 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"bc11417f519dfb8b74f4bf4a1715307d1d959ec014d64529570cdac005c1da60"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.441573 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454638 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454693 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cnibin\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " 
pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454715 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454760 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-os-release\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454820 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454867 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-system-cni-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454889 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454914 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454959 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.454996 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455019 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455097 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455116 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455149 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455166 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg76h\" (UniqueName: \"kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455181 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455200 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455234 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455254 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc 
kubenswrapper[4961]: I0201 19:34:28.455310 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccxl8\" (UniqueName: \"kubernetes.io/projected/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-kube-api-access-ccxl8\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455328 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455344 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455375 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455400 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455421 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-binary-copy\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455452 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.455470 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.459977 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.465687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.465727 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.465737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.465753 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.465762 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.479988 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.500384 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.513835 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.528600 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.545352 4961 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.556852 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.556585 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557371 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557514 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557584 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557610 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557759 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557822 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557844 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557866 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg76h\" (UniqueName: \"kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557924 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.557925 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558001 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558082 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558108 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558188 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558236 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558247 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccxl8\" (UniqueName: \"kubernetes.io/projected/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-kube-api-access-ccxl8\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558276 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558297 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558343 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558350 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin\") 
pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558352 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558377 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558427 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558490 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558575 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-binary-copy\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558607 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558743 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558780 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.558807 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cnibin\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: 
\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559012 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559051 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-os-release\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559078 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559124 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-system-cni-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559151 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559191 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cnibin\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559259 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559176 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559317 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd\") pod \"ovnkube-node-ng92q\" (UID: 
\"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559149 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559365 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559380 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559392 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-os-release\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559409 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559435 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-system-cni-dir\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559436 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559872 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.559985 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc 
kubenswrapper[4961]: I0201 19:34:28.560093 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.560632 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-cni-binary-copy\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.561434 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.568889 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.568925 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.568937 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.568955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.568968 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.569281 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.574650 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg76h\" (UniqueName: \"kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h\") pod \"ovnkube-node-ng92q\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.575194 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.576711 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccxl8\" (UniqueName: \"kubernetes.io/projected/d6fb02f0-671c-4a38-bda0-fa32ecfd6930-kube-api-access-ccxl8\") pod \"multus-additional-cni-plugins-6dzl2\" (UID: \"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\") " pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.595755 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.608176 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.629715 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.652728 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: 
I0201 19:34:28.669205 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.672031 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.672106 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.672120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.672141 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.672159 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.676327 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:28 crc kubenswrapper[4961]: W0201 19:34:28.679557 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6fb02f0_671c_4a38_bda0_fa32ecfd6930.slice/crio-cb0271c57aac7c9740c878227f9c0908f36cb5107b55c067d1e4d4a559c8010a WatchSource:0}: Error finding container cb0271c57aac7c9740c878227f9c0908f36cb5107b55c067d1e4d4a559c8010a: Status 404 returned error can't find the container with id cb0271c57aac7c9740c878227f9c0908f36cb5107b55c067d1e4d4a559c8010a Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.686249 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.706965 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.712729 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.722534 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.725603 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.728442 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.749819 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.768832 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.775138 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.775191 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.775206 4961 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.775230 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.775246 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.781871 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.795454 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.813299 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.828187 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.839818 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.854282 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.871294 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.878902 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.878963 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.878977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.879003 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.879066 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.894405 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.911165 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.931706 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.954062 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: 
I0201 19:34:28.977229 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"start
edAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.981225 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.981297 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.981317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.981342 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.981360 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:28Z","lastTransitionTime":"2026-02-01T19:34:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:28 crc kubenswrapper[4961]: I0201 19:34:28.997885 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:28Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.084617 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.084671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.084697 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.084713 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.084724 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.160011 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:40:20.366733134 +0000 UTC Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.187756 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.187800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.187811 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.187827 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.187836 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.290081 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.290120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.290130 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.290143 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.290152 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.393869 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.393914 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.393926 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.393943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.393953 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.428502 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.430589 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b" exitCode=0 Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.430643 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.430671 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"6f1ef0b4b17a5cf9cee32bbc5c6d5e1c67b0d30fb8254462065740c74bccf6cc"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.434223 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906" exitCode=0 Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.434333 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.434435 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerStarted","Data":"cb0271c57aac7c9740c878227f9c0908f36cb5107b55c067d1e4d4a559c8010a"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.436812 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.441561 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c9pwd" event={"ID":"4a4b5d70-50da-41c9-8016-8134283d74bc","Type":"ContainerStarted","Data":"9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.444434 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.471193 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.485376 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.498087 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.499115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.499140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.499150 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.499165 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.499174 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.509568 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.528612 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.544835 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.566328 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.587183 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.598894 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.602208 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.602260 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.602269 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.602287 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.602321 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.612708 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.629925 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.641480 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.654383 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c85
7df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.671676 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e491
17b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.688963 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.701442 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.704456 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.704501 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.704513 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.704530 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.704543 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.714269 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.722497 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.731213 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.742732 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.753808 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.765813 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.775557 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.793689 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.808066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.808114 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.808123 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.808135 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.808162 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.809161 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:29Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.910325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.910580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.910590 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.910602 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.910612 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:29Z","lastTransitionTime":"2026-02-01T19:34:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.979442 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.979794 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:33.97976084 +0000 UTC m=+28.504792263 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.979857 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.979924 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.979975 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:29 crc kubenswrapper[4961]: I0201 19:34:29.980003 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980065 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980094 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980109 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980157 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980156 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980213 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980235 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980173 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:33.98014613 +0000 UTC m=+28.505177563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980252 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980275 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:33.980253253 +0000 UTC m=+28.505284776 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980292 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:33.980283874 +0000 UTC m=+28.505315417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 01 19:34:29 crc kubenswrapper[4961]: E0201 19:34:29.980311 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:33.980303234 +0000 UTC m=+28.505334787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.012719 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.012747 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.012755 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.012768 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.012779 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.115882 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.115909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.115918 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.115932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.115941 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.160812 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 06:06:35.452004035 +0000 UTC
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.218622 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.218899 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.218908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.218921 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.218931 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.236245 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 19:34:30 crc kubenswrapper[4961]: E0201 19:34:30.236362 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.236393 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.236460 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 19:34:30 crc kubenswrapper[4961]: E0201 19:34:30.236556 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 19:34:30 crc kubenswrapper[4961]: E0201 19:34:30.236610 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.321556 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.321599 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.321611 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.321628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.321640 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.423521 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.423552 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.423561 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.423574 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.423582 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447379 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447428 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447441 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447452 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447462 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.447586 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9"}
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.449214 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165" exitCode=0
Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.449251 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.469605 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.482849 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.493692 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.504325 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.513409 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.523694 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.526571 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.526599 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.526607 4961 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.526620 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.526629 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.535340 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.554600 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z 
is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.567334 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.579165 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.589489 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.607568 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68
e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.629641 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.629677 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.629688 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.629704 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.629717 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.630648 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.731791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.731820 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.731829 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.731842 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.731850 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.834155 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.834190 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.834198 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.834512 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.834536 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.923632 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5pnzb"] Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.923941 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.925667 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.925811 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.925874 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.925979 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937400 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937764 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.937819 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:30Z","lastTransitionTime":"2026-02-01T19:34:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.948784 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.961932 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.976280 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.988762 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a8256b0-9e0d-4560-bbae-08a628c26a2f-serviceca\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.988843 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8256b0-9e0d-4560-bbae-08a628c26a2f-host\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.988918 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm66x\" (UniqueName: \"kubernetes.io/projected/9a8256b0-9e0d-4560-bbae-08a628c26a2f-kube-api-access-xm66x\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:30 crc kubenswrapper[4961]: I0201 19:34:30.994521 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\
"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:30Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.010909 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.026699 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\
":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.039892 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.039922 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.039932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.039944 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.039954 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.045515 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.058157 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.069908 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.084230 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.089707 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm66x\" (UniqueName: \"kubernetes.io/projected/9a8256b0-9e0d-4560-bbae-08a628c26a2f-kube-api-access-xm66x\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.089750 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a8256b0-9e0d-4560-bbae-08a628c26a2f-serviceca\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.089777 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8256b0-9e0d-4560-bbae-08a628c26a2f-host\") pod 
\"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.089835 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a8256b0-9e0d-4560-bbae-08a628c26a2f-host\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.090882 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a8256b0-9e0d-4560-bbae-08a628c26a2f-serviceca\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.095254 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.105814 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm66x\" (UniqueName: \"kubernetes.io/projected/9a8256b0-9e0d-4560-bbae-08a628c26a2f-kube-api-access-xm66x\") pod \"node-ca-5pnzb\" (UID: \"9a8256b0-9e0d-4560-bbae-08a628c26a2f\") " pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.110323 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":
\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.122867 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.141466 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.141492 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.141503 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.141524 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.141533 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.161153 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 14:17:20.880693827 +0000 UTC Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.236049 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5pnzb" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.243458 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.243490 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.243502 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.243516 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.243528 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.346018 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.346083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.346096 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.346112 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.346125 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.448242 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.448286 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.448299 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.448315 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.448328 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.451687 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5pnzb" event={"ID":"9a8256b0-9e0d-4560-bbae-08a628c26a2f","Type":"ContainerStarted","Data":"24168202475ab7e5e676827d4ad78c761d10ffa97eb592dc0b52fd4899fd99f0"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.454002 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444" exitCode=0 Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.454065 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.465984 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.477510 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.488619 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.502094 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.514290 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.537901 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.550946 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.550977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.550986 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.550999 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.551008 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.554063 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.566782 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.585208 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.594971 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.604084 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.615029 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.634940 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.647805 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.654053 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.654166 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.654224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.654289 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.654344 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.756619 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.756673 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.756684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.756700 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.756713 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.858852 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.859156 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.859260 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.859341 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.859417 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.946919 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.951843 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.955927 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.962300 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.962346 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.962355 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.962370 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.962381 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:31Z","lastTransitionTime":"2026-02-01T19:34:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.963749 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.975930 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:31 crc kubenswrapper[4961]: I0201 19:34:31.985543 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:31Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.001928 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.014465 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.024571 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.037817 4961 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.049191 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.064978 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.065075 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.065091 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.065115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.065128 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.066906 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.081911 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.105196 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.119919 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.136756 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01
T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.160136 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.162113 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 09:19:31.979116254 +0000 UTC Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.166998 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 
19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.167058 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.167070 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.167088 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.167100 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.182374 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.205820 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.235187 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.235243 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.235296 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:32 crc kubenswrapper[4961]: E0201 19:34:32.235337 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:32 crc kubenswrapper[4961]: E0201 19:34:32.235438 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:32 crc kubenswrapper[4961]: E0201 19:34:32.235515 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.244024 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z 
is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.255237 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.265866 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.269369 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.269404 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.269413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.269426 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.269435 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.299553 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.352092 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.371660 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.371706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.371719 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.371737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.371749 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.381698 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.423269 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.460829 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.463278 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602" exitCode=0 Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.463348 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.464520 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5pnzb" event={"ID":"9a8256b0-9e0d-4560-bbae-08a628c26a2f","Type":"ContainerStarted","Data":"c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.466500 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.473348 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.473380 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.473389 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.473406 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.473418 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.504709 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.540116 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.575826 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.575870 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.575886 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.575908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.575924 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.585560 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.624126 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.659788 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.678780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.678834 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.678844 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.678859 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.678937 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.701240 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.744678 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.782564 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.782632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.782655 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.782684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.782706 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.791344 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.824061 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.862342 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.884764 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.884807 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.884820 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.884838 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.884850 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.902293 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.961481 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.982630 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:32Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.987222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.987259 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.987267 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.987281 4961 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 01 19:34:32 crc kubenswrapper[4961]: I0201 19:34:32.987292 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:32Z","lastTransitionTime":"2026-02-01T19:34:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.021857 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.062851 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.089485 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.089681 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.089740 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.089803 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.089865 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.100639 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.140953 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.162586 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:38:32.187263886 +0000 UTC Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.180589 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.191900 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.191939 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.191948 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.191962 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.191971 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.226368 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a
79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.260635 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.294330 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.294714 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.295067 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.295365 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.295646 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.398404 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.398685 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.398836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.398971 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.399117 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.477599 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211" exitCode=0 Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.479290 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.501857 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.502437 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.502600 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.502907 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.503251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.503531 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.534164 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.546468 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.558185 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.571771 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.589725 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.610355 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e2
61497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.623498 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.624388 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.624425 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.624434 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.624449 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.624462 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.640122 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccx
l8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.665689 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.701548 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.726797 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.726826 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.726835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.726848 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.726857 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.739960 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.780933 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.826440 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.829062 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.829111 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.829122 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.829140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.829151 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.861969 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:33Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.931081 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.931139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.931152 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.931171 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:33 crc kubenswrapper[4961]: I0201 19:34:33.931183 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:33Z","lastTransitionTime":"2026-02-01T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.016347 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.016463 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.016494 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.016527 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.016554 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.016669 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.016724 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.016707556 +0000 UTC m=+36.541738999 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017099 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017143 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017165 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017178 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017191 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.017157128 +0000 UTC m=+36.542188591 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017261 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.01724798 +0000 UTC m=+36.542279433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017149 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017294 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017349 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.017337282 +0000 UTC m=+36.542368745 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017110 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.017420 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.017404704 +0000 UTC m=+36.542436167 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.033997 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.034029 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.034049 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.034063 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.034072 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.136167 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.136207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.136215 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.136231 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.136240 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.164422 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 15:21:51.473755738 +0000 UTC Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.235182 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.235216 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.235260 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.235308 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.235425 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:34 crc kubenswrapper[4961]: E0201 19:34:34.235514 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.238600 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.238632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.238640 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.238655 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.238666 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.341358 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.341390 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.341399 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.341412 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.341421 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.444250 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.444290 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.444301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.444320 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.444330 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.503649 4961 generic.go:334] "Generic (PLEG): container finished" podID="d6fb02f0-671c-4a38-bda0-fa32ecfd6930" containerID="b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe" exitCode=0 Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.503709 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerDied","Data":"b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.520874 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.541793 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.546207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.546312 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.546413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.546508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.546585 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.559783 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.574108 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.588951 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.603068 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.623101 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.638836 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.648205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.648254 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.648267 4961 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.648288 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.648302 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.652292 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.671684 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z 
is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.686013 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.703008 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.729145 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.742838 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.750865 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.750894 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.750903 4961 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.750915 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.750923 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.763536 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:34Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.852871 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.852942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.852967 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.852998 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.853071 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.955343 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.955408 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.955425 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.955451 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:34 crc kubenswrapper[4961]: I0201 19:34:34.955478 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:34Z","lastTransitionTime":"2026-02-01T19:34:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.058989 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.059027 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.059048 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.059063 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.059073 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.160980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.161030 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.161062 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.161083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.161096 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.165310 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:08:31.47214115 +0000 UTC Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.264139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.264178 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.264187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.264200 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.264209 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.367557 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.367604 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.367619 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.367639 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.367654 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.470778 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.470823 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.470832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.470850 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.470863 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.512388 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.512930 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.517957 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" event={"ID":"d6fb02f0-671c-4a38-bda0-fa32ecfd6930","Type":"ContainerStarted","Data":"504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.535434 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.553405 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.569543 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.584304 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.605859 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.605897 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.605909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.605930 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.605944 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.612912 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.619706 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f4
0e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\
\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.646026 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.662519 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.682780 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.702339 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.708402 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.708452 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.708470 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.708495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.708512 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.735804 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.752213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.752291 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.752315 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc 
kubenswrapper[4961]: I0201 19:34:35.752347 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.752378 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.755992 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.769662 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"2
71ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.773412 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-r
elease\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.774308 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.774392 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.774423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.774457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.774479 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.789522 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.797451 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.801508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.801542 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.801553 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.801570 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.801582 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.804781 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.821217 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.822344 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.830824 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.830888 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.830906 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.830930 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.830950 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.839371 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.848212 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.852800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.853001 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.853143 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.853262 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.853351 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.854951 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.873174 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: E0201 19:34:35.873333 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875255 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875345 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875403 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875418 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875441 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.875458 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.891192 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.903923 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.918758 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.934500 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.947907 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.968390 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:35Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.977980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.978019 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.978031 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.978064 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.978073 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:35Z","lastTransitionTime":"2026-02-01T19:34:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.992234 4961 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 01 19:34:35 crc kubenswrapper[4961]: I0201 19:34:35.993081 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd/pods/etcd-crc/status\": read tcp 38.102.83.119:32786->38.102.83.119:6443: use of closed network connection" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.029482 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.048652 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.069057 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: 
I0201 19:34:36.081069 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.081104 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.081112 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.081129 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.081140 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.084094 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.096119 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.165876 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 10:46:17.014594416 +0000 UTC Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.184513 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.184570 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.184580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.184597 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.184606 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.234881 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.234913 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:36 crc kubenswrapper[4961]: E0201 19:34:36.235018 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.234938 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:36 crc kubenswrapper[4961]: E0201 19:34:36.235249 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:36 crc kubenswrapper[4961]: E0201 19:34:36.235406 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.251455 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.263446 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.273749 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.283308 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.287624 4961 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.287754 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.287791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.287832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.287863 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.303710 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.316765 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\
"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.334265 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.350562 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.367636 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.388723 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.390141 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.390205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.390219 4961 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.390239 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.390253 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.407163 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.430359 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e
591db50addd0eaf99db78b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.454435 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.472911 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.493640 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.493682 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.493693 4961 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.493711 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.493723 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.496174 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e2
8c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.521963 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.522108 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.555206 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.574471 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.587326 4961 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.596616 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.596666 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.596687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.596712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.596730 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.603276 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.617246 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.636833 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.654921 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.669564 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.686882 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.699992 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.700063 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.700075 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.700094 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.700105 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.705246 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.724437 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.757812 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e
591db50addd0eaf99db78b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.782888 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.802642 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.802680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.802690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.802706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.802719 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.804115 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.831696 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.904952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.904981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:36 crc 
kubenswrapper[4961]: I0201 19:34:36.904989 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.905001 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.905009 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:36Z","lastTransitionTime":"2026-02-01T19:34:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:36 crc kubenswrapper[4961]: I0201 19:34:36.911092 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:36Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.007402 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.007441 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.007451 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.007466 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.007477 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.109523 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.109561 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.109572 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.109590 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.109601 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.166637 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 18:31:05.13955592 +0000 UTC Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.211274 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.211302 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.211310 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.211323 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.211332 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.313800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.313859 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.313898 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.313941 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.313959 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.418350 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.418411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.418433 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.418464 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.418488 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.523601 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.523664 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.523684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.523711 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.523729 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.627571 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.627632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.627650 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.627674 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.627691 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.730892 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.730936 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.730945 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.730959 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.730969 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.834653 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.834728 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.834750 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.834774 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.834790 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.937609 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.937662 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.937679 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.937701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:37 crc kubenswrapper[4961]: I0201 19:34:37.937719 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:37Z","lastTransitionTime":"2026-02-01T19:34:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.040687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.040741 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.040766 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.040791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.040810 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.143651 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.143706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.143731 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.143763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.143786 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.166807 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 18:33:44.211893044 +0000 UTC
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.234476 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.234551 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 01 19:34:38 crc kubenswrapper[4961]: E0201 19:34:38.234591 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 01 19:34:38 crc kubenswrapper[4961]: E0201 19:34:38.234690 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.234781 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 01 19:34:38 crc kubenswrapper[4961]: E0201 19:34:38.234855 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.245974 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.246004 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.246012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.246026 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.246051 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.348981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.349078 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.349104 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.349133 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.349159 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.451908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.451944 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.451952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.451965 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.451976 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.532907 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/0.log"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.537360 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d" exitCode=1
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.537423 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d"}
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.540574 4961 scope.go:117] "RemoveContainer" containerID="e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.554883 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.555401 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.555424 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.555453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.555478 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.564007 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.582173 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.598530 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.616825 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\
\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.638888 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"
},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.656689 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\"
:\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.659335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.659392 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.659430 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.659462 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.659487 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.681806 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.721380 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e
591db50addd0eaf99db78b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:37Z\\\",\\\"message\\\":\\\"te (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 19:34:37.507709 6256 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 19:34:37.507732 6256 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 19:34:37.507761 6256 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 19:34:37.507840 6256 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 19:34:37.507850 6256 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 19:34:37.507877 6256 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 19:34:37.507895 6256 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 19:34:37.507903 6256 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 19:34:37.507895 6256 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 19:34:37.507935 6256 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 19:34:37.507949 6256 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 19:34:37.507961 6256 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 19:34:37.508196 6256 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 19:34:37.508256 6256 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.746156 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.763089 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.763125 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.763137 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.763155 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.763166 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.765468 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.783391 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.803938 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.837609 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.850668 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.864511 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:38Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.865580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.865612 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:38 crc 
kubenswrapper[4961]: I0201 19:34:38.865624 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.865639 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.865649 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.967662 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.967712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.967727 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.967749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:38 crc kubenswrapper[4961]: I0201 19:34:38.967765 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:38Z","lastTransitionTime":"2026-02-01T19:34:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.070947 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.070989 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.071000 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.071019 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.071036 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.207598 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:17:56.534732758 +0000 UTC Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.209190 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.209224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.209233 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.209246 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.209259 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.312209 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.312254 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.312264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.312280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.312290 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.414447 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.414485 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.414494 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.414507 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.414516 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.518207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.518245 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.518256 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.518270 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.518282 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.541840 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/0.log" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.544284 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.544712 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.557022 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.571372 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\
\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.585305 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.598989 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.613123 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.620633 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.620667 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.620678 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.620696 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.620706 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.624104 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.636161 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.670794 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.701848 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:37Z\\\",\\\"message\\\":\\\"te (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 19:34:37.507709 6256 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 19:34:37.507732 6256 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 19:34:37.507761 6256 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 19:34:37.507840 6256 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 19:34:37.507850 6256 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 19:34:37.507877 6256 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 19:34:37.507895 6256 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 19:34:37.507903 6256 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 19:34:37.507895 6256 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 19:34:37.507935 6256 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 19:34:37.507949 6256 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 19:34:37.507961 6256 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 19:34:37.508196 6256 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 19:34:37.508256 6256 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.722543 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.722586 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.722595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.722610 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.722621 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.728388 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.740406 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.754224 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.766841 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: 
I0201 19:34:39.778628 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.790620 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.824621 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.824663 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.824671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.824685 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.824695 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.927007 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.927090 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.927103 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.927120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:39 crc kubenswrapper[4961]: I0201 19:34:39.927133 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:39Z","lastTransitionTime":"2026-02-01T19:34:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.030158 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.030194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.030202 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.030218 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.030229 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.132781 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.132856 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.132875 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.132900 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.132917 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.208164 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 04:01:54.173809476 +0000 UTC Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.234294 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:40 crc kubenswrapper[4961]: E0201 19:34:40.234471 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.235026 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:40 crc kubenswrapper[4961]: E0201 19:34:40.235173 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.235497 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:40 crc kubenswrapper[4961]: E0201 19:34:40.235607 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.237204 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.237247 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.237256 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.237270 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.237280 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.340153 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.340205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.340222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.340245 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.340262 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.443708 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.443762 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.443779 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.443802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.443819 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.546482 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.546560 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.546578 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.546601 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.546619 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.549674 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/1.log" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.550552 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/0.log" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.553743 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c" exitCode=1 Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.553797 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.553867 4961 scope.go:117] "RemoveContainer" containerID="e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.555197 4961 scope.go:117] "RemoveContainer" containerID="9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c" Feb 01 19:34:40 crc kubenswrapper[4961]: E0201 19:34:40.555518 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.590947 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.608875 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.631832 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.651567 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.651628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc 
kubenswrapper[4961]: I0201 19:34:40.651647 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.651672 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.651690 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.654088 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.671927 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.688922 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.702184 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.716196 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.728412 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.738536 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt"] Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.739380 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.741209 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.741626 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.744331 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445
deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.753686 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.753723 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.753733 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.753748 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.753759 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.759612 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.772725 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.784640 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.798823 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.815428 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:37Z\\\",\\\"message\\\":\\\"te (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 19:34:37.507709 6256 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 19:34:37.507732 6256 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 19:34:37.507761 6256 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 19:34:37.507840 6256 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 19:34:37.507850 6256 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 19:34:37.507877 6256 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 19:34:37.507895 6256 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 19:34:37.507903 6256 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 19:34:37.507895 6256 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 19:34:37.507935 6256 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 19:34:37.507949 6256 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 19:34:37.507961 6256 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 19:34:37.508196 6256 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 19:34:37.508256 6256 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.838771 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.852829 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.856763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.856788 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.856799 4961 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.856815 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.856825 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.869878 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e2
8c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.881232 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.897129 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.916264 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.929970 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.930030 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564v9\" (UniqueName: \"kubernetes.io/projected/b330f5dc-7057-4da0-8631-76683675a7a6-kube-api-access-564v9\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.930145 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b330f5dc-7057-4da0-8631-76683675a7a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.930279 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.931466 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.943550 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.958604 4961 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.959139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.959174 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.959215 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.959235 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.959249 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:40Z","lastTransitionTime":"2026-02-01T19:34:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.971143 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:40 crc kubenswrapper[4961]: I0201 19:34:40.990249 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:40Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.011666 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.031747 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 
19:34:41.031847 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564v9\" (UniqueName: \"kubernetes.io/projected/b330f5dc-7057-4da0-8631-76683675a7a6-kube-api-access-564v9\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.031886 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.031923 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b330f5dc-7057-4da0-8631-76683675a7a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.033150 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.033261 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b330f5dc-7057-4da0-8631-76683675a7a6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.034250 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.043834 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b330f5dc-7057-4da0-8631-76683675a7a6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.054276 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564v9\" (UniqueName: \"kubernetes.io/projected/b330f5dc-7057-4da0-8631-76683675a7a6-kube-api-access-564v9\") pod \"ovnkube-control-plane-749d76644c-tpzkt\" (UID: \"b330f5dc-7057-4da0-8631-76683675a7a6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.061467 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.061531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.061559 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.061592 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.061616 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.076558 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e93d11419f730aff25814823ecc8b1451957952e591db50addd0eaf99db78b8d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:37Z\\\",\\\"message\\\":\\\"te (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0201 19:34:37.507709 6256 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0201 19:34:37.507732 6256 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0201 19:34:37.507761 6256 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0201 19:34:37.507840 6256 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0201 19:34:37.507850 6256 handler.go:208] Removed *v1.Node event handler 2\\\\nI0201 19:34:37.507877 6256 handler.go:208] Removed *v1.Node event handler 7\\\\nI0201 19:34:37.507895 6256 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0201 19:34:37.507903 6256 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0201 19:34:37.507895 6256 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0201 19:34:37.507935 6256 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0201 19:34:37.507949 6256 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0201 19:34:37.507961 6256 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0201 19:34:37.508196 6256 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0201 19:34:37.508256 6256 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 
f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.096867 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.114693 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.164273 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.164317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.164326 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.164342 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.164354 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.209944 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:49:12.917454632 +0000 UTC Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.267607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.267655 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.267673 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.267699 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.267718 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.352296 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.370176 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.370224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.370241 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.370264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.370281 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: W0201 19:34:41.371233 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb330f5dc_7057_4da0_8631_76683675a7a6.slice/crio-81e6ace20600637096bd4b142eeca9d7965e5420ecfdcd98d49f1375bc6069ef WatchSource:0}: Error finding container 81e6ace20600637096bd4b142eeca9d7965e5420ecfdcd98d49f1375bc6069ef: Status 404 returned error can't find the container with id 81e6ace20600637096bd4b142eeca9d7965e5420ecfdcd98d49f1375bc6069ef Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.474608 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.474680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.474706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.474737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.474760 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.565990 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" event={"ID":"b330f5dc-7057-4da0-8631-76683675a7a6","Type":"ContainerStarted","Data":"81e6ace20600637096bd4b142eeca9d7965e5420ecfdcd98d49f1375bc6069ef"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.568708 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/1.log" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.575161 4961 scope.go:117] "RemoveContainer" containerID="9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c" Feb 01 19:34:41 crc kubenswrapper[4961]: E0201 19:34:41.575467 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.578916 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.578959 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.578977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.578998 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.579016 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.596638 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.616388 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.636479 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.658531 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.681488 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.681539 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.681557 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.681582 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.681600 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.701813 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.721090 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.742362 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.767728 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.784318 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.784387 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.784407 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.784433 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.784453 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.805605 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.827775 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.840903 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-j6fs5"] Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.841490 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:41 crc kubenswrapper[4961]: E0201 19:34:41.841566 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.845154 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.859944 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.877557 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.887651 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.887702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.887717 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.887736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.887750 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.894146 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.909594 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.927509 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.940090 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.940130 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2tt\" (UniqueName: \"kubernetes.io/projected/b35577aa-3e17-4907-80d9-b8911ea55c25-kube-api-access-jc2tt\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.955329 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"
name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.975610 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:41Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.990659 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.990722 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.990740 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.990765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:41 crc kubenswrapper[4961]: I0201 19:34:41.990784 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:41Z","lastTransitionTime":"2026-02-01T19:34:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.002471 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.021563 4961 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.035723 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.041695 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.041926 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:34:58.041876758 +0000 UTC m=+52.566908201 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042011 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042105 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2tt\" (UniqueName: \"kubernetes.io/projected/b35577aa-3e17-4907-80d9-b8911ea55c25-kube-api-access-jc2tt\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042171 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042186 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042239 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042271 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:42.542242269 +0000 UTC m=+37.067273692 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042434 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042460 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042475 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042566 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042592 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042707 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042734 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042606 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042631 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:58.042604748 +0000 UTC m=+52.567636211 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.042858 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042934 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:58.042921076 +0000 UTC m=+52.567952499 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.042961 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:58.042950367 +0000 UTC m=+52.567981790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.043290 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.043513 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:58.04345818 +0000 UTC m=+52.568489643 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.051463 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.066834 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.068230 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jc2tt\" (UniqueName: \"kubernetes.io/projected/b35577aa-3e17-4907-80d9-b8911ea55c25-kube-api-access-jc2tt\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.080144 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.092659 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.093072 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.093167 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.093228 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.093289 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.093342 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.106733 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.119058 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.135118 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.150092 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.164665 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.180175 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.197076 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.197207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.197266 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.197325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.197405 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.198741 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.210970 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 17:28:11.385725159 +0000 UTC Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.223520 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.236348 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.236428 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.236448 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.236624 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.236732 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.236863 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.300240 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.300300 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.300318 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.300342 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.300359 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.403503 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.403559 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.403579 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.403606 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.403635 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.506428 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.506585 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.506603 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.506629 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.506693 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.547684 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.547911 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: E0201 19:34:42.548103 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:43.548069468 +0000 UTC m=+38.073100931 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.583550 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" event={"ID":"b330f5dc-7057-4da0-8631-76683675a7a6","Type":"ContainerStarted","Data":"1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.583641 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" event={"ID":"b330f5dc-7057-4da0-8631-76683675a7a6","Type":"ContainerStarted","Data":"7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.610153 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.610233 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.610256 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.610288 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.610310 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.618701 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.639522 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.661571 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.683358 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: 
I0201 19:34:42.705433 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.713965 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.714065 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.714088 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.714119 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.714137 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.725745 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.745478 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.768802 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.786970 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.807589 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 
19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.817697 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.817751 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.817768 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.817795 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.817813 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.825876 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.859635 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.886610 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.908982 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.921423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.921473 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.921489 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.921514 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.921534 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:42Z","lastTransitionTime":"2026-02-01T19:34:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.928371 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.934340 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.951334 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:42 crc kubenswrapper[4961]: I0201 19:34:42.972984 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.000276 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:42Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.021601 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.024109 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.024144 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.024172 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.024202 4961 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.024223 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.041538 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.060996 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.079842 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.099806 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.119552 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.126632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.126692 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.126710 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.126729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.126743 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.139277 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.159422 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.183003 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.200137 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.212360 4961 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 11:25:37.17958539 +0000 UTC Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.215487 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.229382 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.229436 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.229452 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.229474 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.229489 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.232554 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.234235 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:43 crc kubenswrapper[4961]: E0201 19:34:43.234406 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.264442 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.284703 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.305745 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.319314 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:43Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.332713 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.332750 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.332760 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.332776 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.332787 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.435906 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.435957 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.435981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.436004 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.436025 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.537928 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.537968 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.537982 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.538000 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.538015 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.557881 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:43 crc kubenswrapper[4961]: E0201 19:34:43.558094 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:43 crc kubenswrapper[4961]: E0201 19:34:43.558176 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:45.558153667 +0000 UTC m=+40.083185130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.640923 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.640977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.640996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.641025 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.641081 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.744624 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.744703 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.744728 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.744760 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.744786 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.846993 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.847087 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.847105 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.847129 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.847148 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.950055 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.950090 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.950098 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.950111 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:43 crc kubenswrapper[4961]: I0201 19:34:43.950120 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:43Z","lastTransitionTime":"2026-02-01T19:34:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.052342 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.052379 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.052388 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.052406 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.052419 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.155621 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.155666 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.155675 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.155694 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.155704 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.212846 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 20:25:56.463697467 +0000 UTC Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.234421 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.234499 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:44 crc kubenswrapper[4961]: E0201 19:34:44.234664 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.234769 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:44 crc kubenswrapper[4961]: E0201 19:34:44.234899 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:44 crc kubenswrapper[4961]: E0201 19:34:44.235624 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.257928 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.258001 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.258023 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.258091 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.258116 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.361367 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.361452 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.361476 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.361506 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.361542 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.464929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.464980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.464990 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.465010 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.465022 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.568966 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.569034 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.569064 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.569081 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.569095 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.673465 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.673536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.673559 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.673585 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.673605 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.777406 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.777480 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.777503 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.777539 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.777561 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.880436 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.880499 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.880518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.880543 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.880560 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.984031 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.984162 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.984179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.984206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:44 crc kubenswrapper[4961]: I0201 19:34:44.984224 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:44Z","lastTransitionTime":"2026-02-01T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.088010 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.088108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.088127 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.088151 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.088169 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.190709 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.190786 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.190804 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.190832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.190857 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.213436 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:44:44.911585415 +0000 UTC Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.234881 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:45 crc kubenswrapper[4961]: E0201 19:34:45.235111 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.293435 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.293524 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.293546 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.293576 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.293594 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.395871 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.395918 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.395929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.395950 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.395963 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.498652 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.498706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.498718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.498736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.498749 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.578479 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:45 crc kubenswrapper[4961]: E0201 19:34:45.578675 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:45 crc kubenswrapper[4961]: E0201 19:34:45.578737 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:49.578722364 +0000 UTC m=+44.103753797 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.601581 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.601701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.601724 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.601750 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.601768 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.704891 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.704956 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.704973 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.705000 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.705019 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.808143 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.808194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.808206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.808224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.808238 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.911480 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.911545 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.911564 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.911590 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:45 crc kubenswrapper[4961]: I0201 19:34:45.911609 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:45Z","lastTransitionTime":"2026-02-01T19:34:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.014377 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.014439 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.014471 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.014501 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.014520 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.117519 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.117577 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.117590 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.117609 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.117622 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.167118 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.167197 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.167220 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.167251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.167275 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.188614 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.193168 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.193225 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.193239 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.193257 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.193269 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.212439 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.214716 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 00:15:29.471693286 +0000 UTC Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.217295 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.217347 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.217363 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.217386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.217403 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.234169 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.234225 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.234297 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.234337 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.234646 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.234802 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.236727 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.243118 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.243182 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.243207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.243240 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.243331 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.258580 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.264351 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.269841 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.270080 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.270208 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.270341 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.270484 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.281148 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc 
kubenswrapper[4961]: E0201 19:34:46.290806 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: E0201 19:34:46.290914 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.293666 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 
crc kubenswrapper[4961]: I0201 19:34:46.293707 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.293728 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.293752 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.293770 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.299853 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.314975 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.336622 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.356725 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.379566 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.396853 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.396932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.396955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.396984 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.397007 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.400303 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.420133 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.452632 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.476113 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.495647 4961 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc66886
03596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.499986 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.500215 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.500380 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.500522 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.500647 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.521192 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.553216 4961 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759fea
bf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.573109 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.591093 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.602950 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.602991 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.603003 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.603022 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.603068 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.606798 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:46Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.706024 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.706103 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.706118 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.706136 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.706147 4961 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.808410 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.808444 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.808456 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.808472 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.808482 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.911403 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.911473 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.911483 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.911495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:46 crc kubenswrapper[4961]: I0201 19:34:46.911504 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:46Z","lastTransitionTime":"2026-02-01T19:34:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.014207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.014366 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.014379 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.014396 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.014407 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.116824 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.116866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.116881 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.116901 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.116915 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.214968 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 10:03:21.276075611 +0000 UTC Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.219363 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.219423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.219439 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.219464 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.219480 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.234769 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:47 crc kubenswrapper[4961]: E0201 19:34:47.234956 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.322066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.322126 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.322139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.322159 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.322171 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.445865 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.446365 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.446494 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.446616 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.446751 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.552788 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.552855 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.552928 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.552953 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.552966 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.655235 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.655641 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.655802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.655942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.656101 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.759889 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.759964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.759983 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.760011 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.760064 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.862832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.862876 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.862888 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.862903 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.862914 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.965518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.965565 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.965577 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.965595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:47 crc kubenswrapper[4961]: I0201 19:34:47.965641 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:47Z","lastTransitionTime":"2026-02-01T19:34:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.068222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.068263 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.068271 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.068283 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.068291 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.171236 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.171310 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.171328 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.171354 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.171374 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.215438 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 10:06:14.846678523 +0000 UTC Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.234849 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.234886 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.234909 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:48 crc kubenswrapper[4961]: E0201 19:34:48.235088 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:48 crc kubenswrapper[4961]: E0201 19:34:48.235201 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:48 crc kubenswrapper[4961]: E0201 19:34:48.235470 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.273596 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.273648 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.273664 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.273684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.273723 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.377378 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.377547 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.377587 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.377618 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.377639 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.481183 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.481280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.481298 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.481322 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.481370 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.585209 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.585311 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.585393 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.585482 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.585516 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.689595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.689689 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.689715 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.689745 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.689765 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.793571 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.793638 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.793661 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.793695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.793721 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.897750 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.897809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.897832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.897861 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:48 crc kubenswrapper[4961]: I0201 19:34:48.897882 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:48Z","lastTransitionTime":"2026-02-01T19:34:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.000895 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.000966 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.000986 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.001011 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.001030 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.105545 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.105624 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.105642 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.105670 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.105690 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.208700 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.208770 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.208789 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.208817 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.208840 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.216217 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:02:51.991116963 +0000 UTC Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.234887 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:49 crc kubenswrapper[4961]: E0201 19:34:49.235239 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.312374 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.312453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.312473 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.312504 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.312526 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.416086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.416144 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.416159 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.416180 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.416197 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.520569 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.520635 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.520654 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.520682 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.520701 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.620096 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:49 crc kubenswrapper[4961]: E0201 19:34:49.620250 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:49 crc kubenswrapper[4961]: E0201 19:34:49.620361 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:34:57.620335622 +0000 UTC m=+52.145367055 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.623472 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.623523 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.623540 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.623564 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.623581 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.726237 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.726307 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.726325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.726351 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.726369 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.829362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.829435 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.829453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.829478 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.829499 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.932568 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.932617 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.932632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.932653 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:49 crc kubenswrapper[4961]: I0201 19:34:49.932668 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:49Z","lastTransitionTime":"2026-02-01T19:34:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.035354 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.035407 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.035424 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.035448 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.035468 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.139020 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.139123 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.139141 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.139171 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.139190 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.217263 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 08:29:12.960700863 +0000 UTC Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.235211 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.235250 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.235356 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:50 crc kubenswrapper[4961]: E0201 19:34:50.235420 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:50 crc kubenswrapper[4961]: E0201 19:34:50.235584 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:50 crc kubenswrapper[4961]: E0201 19:34:50.236262 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.247749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.247805 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.247835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.247863 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.247894 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.351811 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.351893 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.351910 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.351940 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.351958 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.454157 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.454644 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.454834 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.454990 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.455195 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.558891 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.558951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.558963 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.558984 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.558998 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.662691 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.662753 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.662770 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.662795 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.662812 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.765207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.765288 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.765303 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.765327 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.765341 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.868982 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.869077 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.869091 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.869111 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.869123 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.972446 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.972486 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.972498 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.972515 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:50 crc kubenswrapper[4961]: I0201 19:34:50.972526 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:50Z","lastTransitionTime":"2026-02-01T19:34:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.075547 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.075599 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.075613 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.075638 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.075656 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.179079 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.179128 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.179143 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.179169 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.179183 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.218427 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 15:29:27.173200057 +0000 UTC Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.234796 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:51 crc kubenswrapper[4961]: E0201 19:34:51.234986 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.281709 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.281767 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.281785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.281810 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.281828 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.383949 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.384100 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.384126 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.384157 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.384180 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.486411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.486463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.486479 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.486499 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.486513 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.590245 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.590300 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.590316 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.590334 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.590346 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.693885 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.693924 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.693936 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.693952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.693964 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.796880 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.796957 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.796997 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.797023 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.797064 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.899680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.899751 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.899764 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.899784 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:51 crc kubenswrapper[4961]: I0201 19:34:51.899819 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:51Z","lastTransitionTime":"2026-02-01T19:34:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.002809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.002855 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.002867 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.002885 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.002898 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.106321 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.106377 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.106395 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.106418 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.106433 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.210223 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.210316 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.210335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.210368 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.210387 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.218815 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:08:20.823579656 +0000 UTC Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.235203 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.235316 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:52 crc kubenswrapper[4961]: E0201 19:34:52.235414 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:52 crc kubenswrapper[4961]: E0201 19:34:52.235518 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.235603 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:52 crc kubenswrapper[4961]: E0201 19:34:52.235814 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.312820 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.312861 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.312876 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.312896 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.312908 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.416081 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.416119 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.416132 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.416149 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.416162 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.518847 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.518914 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.518937 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.518967 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.518989 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.623085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.623450 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.623467 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.623492 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.623509 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.726715 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.726766 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.726780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.726800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.726816 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.829115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.829163 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.829188 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.829207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.829219 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.932361 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.932386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.932394 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.932408 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:52 crc kubenswrapper[4961]: I0201 19:34:52.932418 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:52Z","lastTransitionTime":"2026-02-01T19:34:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.034796 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.035256 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.035468 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.035883 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.036154 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.139298 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.139580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.139646 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.139741 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.139831 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.219329 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 21:03:58.211558562 +0000 UTC Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.234779 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:53 crc kubenswrapper[4961]: E0201 19:34:53.235004 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.243640 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.243718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.243737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.243763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.243781 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.346631 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.346697 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.346718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.346748 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.346783 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.448943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.449010 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.449116 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.449152 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.449177 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.552104 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.552152 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.552164 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.552186 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.552199 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.655096 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.655132 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.655139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.655153 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.655163 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.758386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.758468 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.758510 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.758543 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.758565 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.861436 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.861486 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.861496 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.861510 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.861520 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.872182 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.881718 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.893976 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.908180 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.923511 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.942816 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: 
I0201 19:34:53.956111 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.963164 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.963212 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.963223 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.963243 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.963256 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:53Z","lastTransitionTime":"2026-02-01T19:34:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.968442 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.982437 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:53 crc kubenswrapper[4961]: I0201 19:34:53.993785 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:53Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.006919 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.016570 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.029494 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 
19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.043403 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.054922 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.065731 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.065791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.065808 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.065836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.065855 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.082248 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.102945 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.114600 4961 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.130592 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:54Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.167836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.167900 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.167918 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.167945 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.167965 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.220497 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:40:04.607044817 +0000 UTC Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.234900 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.234955 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:54 crc kubenswrapper[4961]: E0201 19:34:54.235121 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.235180 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:54 crc kubenswrapper[4961]: E0201 19:34:54.235250 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:54 crc kubenswrapper[4961]: E0201 19:34:54.235389 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.275401 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.276184 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.276214 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.276241 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.276260 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.379108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.379180 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.379197 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.379222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.379240 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.482449 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.482536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.482554 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.482584 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.482607 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.585785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.585841 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.585849 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.585865 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.585875 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.688474 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.688523 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.688535 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.688554 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.688567 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.791912 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.792009 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.792024 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.792069 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.792087 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.895111 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.895162 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.895174 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.895194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.895206 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.997362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.997445 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.997471 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.997507 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:54 crc kubenswrapper[4961]: I0201 19:34:54.997534 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:54Z","lastTransitionTime":"2026-02-01T19:34:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.100943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.101018 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.101061 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.101089 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.101109 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.203894 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.203979 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.203998 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.204029 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.204088 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.220730 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 04:01:57.595308662 +0000 UTC Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.234730 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:55 crc kubenswrapper[4961]: E0201 19:34:55.234916 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.306390 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.306448 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.306463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.306489 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.306503 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.408860 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.408932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.408952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.408974 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.408993 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.511996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.512075 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.512085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.512101 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.512111 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.615261 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.615333 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.615350 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.615378 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.615398 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.718463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.718515 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.718533 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.718556 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.718571 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.822278 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.822352 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.822371 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.822399 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.822423 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.924909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.924956 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.924965 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.924981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:55 crc kubenswrapper[4961]: I0201 19:34:55.924991 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:55Z","lastTransitionTime":"2026-02-01T19:34:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.027281 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.027339 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.027353 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.027372 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.027385 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.129551 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.129596 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.129607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.129625 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.129634 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.221978 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 19:24:43.822105319 +0000 UTC Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.231862 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.231904 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.231913 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.231929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.231939 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.234343 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.234502 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.234510 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.234840 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.234950 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.235828 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.237111 4961 scope.go:117] "RemoveContainer" containerID="9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.253960 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.269405 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.289524 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 
19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.303008 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.321544 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.334093 4961 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.334167 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.334179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.334202 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.334234 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.342299 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.361703 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.370531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.370589 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.370601 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.370621 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.370637 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.385491 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.391423 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"2
71ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.397303 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.397386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.397409 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.397438 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.397469 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.402932 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.420220 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.421998 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.425012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.425083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.425098 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.425122 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.425139 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.438133 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.442011 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.442078 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.442096 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.442120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.442136 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.443811 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180
ff5b222831e615d3e81d276c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.454364 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.456165 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.457986 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.458061 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.458077 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.458100 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.458114 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.469936 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.470125 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: E0201 19:34:56.470291 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.472570 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.472628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.472640 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.472658 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.472668 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.486547 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerS
tatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T1
9:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts
-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.506155 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.519477 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.531192 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.540368 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.580756 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.580815 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.580829 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.580843 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.580852 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.634346 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/1.log" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.637202 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.637726 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.662952 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.683866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.683921 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.683934 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.683955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.683967 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.688998 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.703431 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.716524 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.729264 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.739290 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.748969 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 
19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.761382 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.773584 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.783476 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.785840 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.785874 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 
01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.785884 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.785905 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.785917 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.794109 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.806104 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\
\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.819974 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.844804 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c7
6b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 
f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.860975 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.874908 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.887739 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.887789 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.887801 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.887818 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.887829 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.888335 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.900320 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:56Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.990213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.990254 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.990264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.990280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:56 crc kubenswrapper[4961]: I0201 19:34:56.990291 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:56Z","lastTransitionTime":"2026-02-01T19:34:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.092447 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.092488 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.092500 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.092517 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.092546 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.194472 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.194531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.194548 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.194572 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.194588 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.222293 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 21:21:01.323866558 +0000 UTC Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.234440 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:57 crc kubenswrapper[4961]: E0201 19:34:57.234659 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.296943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.297015 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.297032 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.297086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.297104 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.399628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.399695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.399717 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.399749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.399766 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.503152 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.503214 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.503232 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.503260 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.503280 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.606423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.606476 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.606493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.606519 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.606536 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.644794 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/2.log" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.645933 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/1.log" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.650779 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" exitCode=1 Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.650847 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.650911 4961 scope.go:117] "RemoveContainer" containerID="9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.652093 4961 scope.go:117] "RemoveContainer" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" Feb 01 19:34:57 crc kubenswrapper[4961]: E0201 19:34:57.652353 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.671547 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.696866 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.710259 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.710364 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.710389 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.710423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.710445 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.716663 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.719582 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:57 crc kubenswrapper[4961]: E0201 19:34:57.719876 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:57 crc kubenswrapper[4961]: E0201 19:34:57.719999 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:13.719961223 +0000 UTC m=+68.244992676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.736013 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.752953 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.771780 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.792087 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-ku
bernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.807328 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.814004 4961 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.814106 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.814125 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.814155 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.814176 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.826559 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.849186 4961 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.869932 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.901505 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c7
6b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9dc5f08fa216303c48479be48878c1d9c1b13180ff5b222831e615d3e81d276c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:39Z\\\",\\\"message\\\":\\\"led to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:39Z is after 2025-08-24T17:21:41Z]\\\\nI0201 19:34:39.628993 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0201 19:34:39.628999 6384 ovn.go:134] Ensuring zone local for Pod openshift-dns/node-resolver-c9pwd in node crc\\\\nI0201 19:34:39.629001 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629004 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629010 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0201 19:34:39.629012 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0201 19:34:39.629009 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-6dzl2\\\\nI0201 19:34:39.628971 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-ng92q after 0 f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed 
to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\"
:[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.917225 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.917295 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.917312 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.917339 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.917361 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:57Z","lastTransitionTime":"2026-02-01T19:34:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.924182 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.947574 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:57 crc kubenswrapper[4961]: I0201 19:34:57.968229 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.002244 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"fi
nishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.020734 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.020813 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.020835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.020860 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.020878 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.023932 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.049136 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.123845 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124131 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124179 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124210 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:35:30.124172317 +0000 UTC m=+84.649203780 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124285 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124364 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124364 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124424 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124500 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124532 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124446 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124588 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124599 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124621 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124653 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:30.124638869 +0000 UTC m=+84.649670332 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124697 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:30.124669849 +0000 UTC m=+84.649701302 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124468 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124572 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124728 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.124748 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124772 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:30.124757032 +0000 UTC m=+84.649788495 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124553 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.124948 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:30.124933466 +0000 UTC m=+84.649964919 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.222838 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 10:35:33.51280099 +0000 UTC Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.227238 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.227288 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.227307 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.227334 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.227352 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.234875 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.234896 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.235011 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.235162 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.235301 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.235410 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.330223 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.330322 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.330342 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.330375 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.330402 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.434737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.434893 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.434922 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.434956 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.434980 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.537865 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.537933 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.537954 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.537980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.537999 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.640691 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.640788 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.640802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.640821 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.640834 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.657570 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/2.log" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.663426 4961 scope.go:117] "RemoveContainer" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" Feb 01 19:34:58 crc kubenswrapper[4961]: E0201 19:34:58.663717 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.690474 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.714513 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.736004 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.744115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.744179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.744203 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.744237 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.744264 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.758016 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.779639 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.816489 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.849330 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.849409 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.849428 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.849460 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.849481 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.855122 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.890703 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b8064
66498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.910955 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.934108 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.953797 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.953879 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.953897 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.953927 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.953947 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:58Z","lastTransitionTime":"2026-02-01T19:34:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.958628 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.975265 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:58 crc kubenswrapper[4961]: I0201 19:34:58.995780 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:58Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.014577 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\
\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:59Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.046505 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"
},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:59Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.056731 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.056818 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.056840 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.056872 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.056893 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.067232 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:59Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.092788 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:59Z is after 2025-08-24T17:21:41Z" Feb 01 
19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.113103 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:59Z is after 2025-08-24T17:21:41Z" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.160607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.160676 4961 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.160694 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.160721 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.160740 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.223391 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 05:14:29.758813065 +0000 UTC Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.234788 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:34:59 crc kubenswrapper[4961]: E0201 19:34:59.235031 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.264119 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.264179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.264197 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.264226 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.264245 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.366937 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.366977 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.367007 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.367025 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.367053 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.470309 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.470392 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.470412 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.470443 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.470467 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.573539 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.573599 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.573613 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.573633 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.573644 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.676453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.676518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.676534 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.676562 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.676582 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.784634 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.784764 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.784787 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.784822 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.784849 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.889536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.889594 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.889612 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.889639 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.889657 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.992650 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.992696 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.992707 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.992723 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:34:59 crc kubenswrapper[4961]: I0201 19:34:59.992734 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:34:59Z","lastTransitionTime":"2026-02-01T19:34:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.095682 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.095791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.095808 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.095835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.095853 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.199455 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.199513 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.199532 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.199561 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.199580 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.224014 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 22:15:03.517887892 +0000 UTC Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.234714 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.234755 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:00 crc kubenswrapper[4961]: E0201 19:35:00.234878 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.234999 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:00 crc kubenswrapper[4961]: E0201 19:35:00.235066 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:00 crc kubenswrapper[4961]: E0201 19:35:00.235273 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.302757 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.302833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.302850 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.302876 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.302895 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.406116 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.406202 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.406227 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.406264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.406289 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.510087 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.510130 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.510148 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.510167 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.510183 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.612828 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.612902 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.612926 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.612958 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.612978 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.716763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.716828 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.716846 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.716872 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.716893 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.820312 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.820369 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.820422 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.820448 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.820466 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.923274 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.923322 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.923334 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.923355 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:00 crc kubenswrapper[4961]: I0201 19:35:00.923367 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:00Z","lastTransitionTime":"2026-02-01T19:35:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.026004 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.026079 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.026091 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.026110 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.026124 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.128741 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.128806 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.128822 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.128848 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.128869 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.225000 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 17:54:14.157248302 +0000 UTC Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.230865 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.230914 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.230924 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.230952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.230963 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.234164 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:01 crc kubenswrapper[4961]: E0201 19:35:01.234333 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.334677 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.334710 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.334720 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.334736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.334749 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.438086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.438154 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.438170 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.438199 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.438222 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.541013 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.541140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.541158 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.541185 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.541207 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.649736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.649793 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.649810 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.649840 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.649858 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.752833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.752911 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.752928 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.753587 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.753645 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.856108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.856169 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.856189 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.856218 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.856236 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.958843 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.958929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.958952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.958981 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:01 crc kubenswrapper[4961]: I0201 19:35:01.958999 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:01Z","lastTransitionTime":"2026-02-01T19:35:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.061923 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.062324 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.062475 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.062622 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.062756 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.165889 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.165959 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.165978 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.166006 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.166025 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.225389 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 13:47:11.845369867 +0000 UTC Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.234787 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.234831 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:02 crc kubenswrapper[4961]: E0201 19:35:02.234974 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.235011 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:02 crc kubenswrapper[4961]: E0201 19:35:02.235225 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:02 crc kubenswrapper[4961]: E0201 19:35:02.235379 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.269066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.269372 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.269456 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.269553 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.269643 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.371889 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.371945 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.371964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.371989 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.372006 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.475636 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.475702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.475722 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.475750 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.475769 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.579518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.579589 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.579606 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.579634 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.579651 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.682631 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.682694 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.682710 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.682735 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.682752 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.786280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.786373 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.786396 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.786431 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.786456 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.890567 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.890671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.890688 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.890712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.890728 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.994244 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.994343 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.994362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.994389 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:02 crc kubenswrapper[4961]: I0201 19:35:02.994410 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:02Z","lastTransitionTime":"2026-02-01T19:35:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.097332 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.097397 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.097414 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.097441 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.097459 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.200996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.201083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.201102 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.201128 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.201145 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.226568 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 00:38:27.318373022 +0000 UTC Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.235072 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:03 crc kubenswrapper[4961]: E0201 19:35:03.235347 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.304329 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.304404 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.304423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.304454 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.304477 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.407470 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.407531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.407550 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.407576 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.407595 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.510083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.510207 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.510228 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.510255 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.510275 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.613797 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.613873 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.613891 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.613921 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.613943 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.717237 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.717315 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.717332 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.717358 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.717378 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.820536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.820607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.820624 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.820652 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.820672 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.923638 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.923708 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.923727 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.923749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:03 crc kubenswrapper[4961]: I0201 19:35:03.923828 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:03Z","lastTransitionTime":"2026-02-01T19:35:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.026160 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.026239 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.026257 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.026283 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.026302 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.129122 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.129181 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.129200 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.129226 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.129245 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.227236 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 09:34:19.389577856 +0000 UTC Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.232241 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.232453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.232574 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.232729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.232846 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.234842 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.235586 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.235658 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:04 crc kubenswrapper[4961]: E0201 19:35:04.235856 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:04 crc kubenswrapper[4961]: E0201 19:35:04.235987 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:04 crc kubenswrapper[4961]: E0201 19:35:04.236181 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.336266 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.336328 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.336347 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.336377 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.336395 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.439966 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.440334 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.440473 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.440601 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.440768 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.544850 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.544909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.544932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.544957 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.544975 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.648168 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.648259 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.648284 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.648324 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.648350 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.751929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.751996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.752014 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.752066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.752085 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.854983 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.855087 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.855107 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.855135 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.855154 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.958294 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.958365 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.958385 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.958411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:04 crc kubenswrapper[4961]: I0201 19:35:04.958429 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:04Z","lastTransitionTime":"2026-02-01T19:35:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.061200 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.061244 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.061256 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.061277 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.061288 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.164892 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.164944 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.164962 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.164990 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.165009 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.228169 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 11:37:49.972754789 +0000 UTC Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.235202 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:05 crc kubenswrapper[4961]: E0201 19:35:05.235543 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.267674 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.267740 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.267765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.267792 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.267809 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.371448 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.371503 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.371526 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.371558 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.371581 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.475012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.475103 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.475122 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.475148 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.475166 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.578706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.578808 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.578833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.578866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.578886 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.682597 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.682659 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.682681 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.682710 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.682729 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.785729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.785785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.785804 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.785830 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.785850 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.889435 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.889511 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.889529 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.889562 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.889583 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.993675 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.993754 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.993776 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.993807 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:05 crc kubenswrapper[4961]: I0201 19:35:05.993827 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:05Z","lastTransitionTime":"2026-02-01T19:35:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.097582 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.097649 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.097676 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.097706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.097732 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.200842 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.200905 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.200925 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.200961 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.200988 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.228379 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 06:37:11.262042715 +0000 UTC Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.234187 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.234215 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.234351 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.234676 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.234796 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.235021 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.261291 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.282904 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.303512 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.303583 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.303653 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.303686 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.303707 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.315397 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.342278 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.363155 4961 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.384899 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407125 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407253 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407336 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407406 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407432 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.407752 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.445124 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.467179 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.492592 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.511112 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.511161 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc 
kubenswrapper[4961]: I0201 19:35:06.511180 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.511206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.511226 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.516912 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.538926 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.556361 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.575149 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.594154 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.615226 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.616485 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.616544 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.616561 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.616590 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.616608 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.632800 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.652895 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overri
des\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.713389 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.713467 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.713484 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.713515 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.713533 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.736831 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.742642 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.742726 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.742751 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.742785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.742812 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.766908 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.773087 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.773175 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.773197 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.773224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.773245 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.795173 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.801589 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.801665 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.801690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.801723 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.801747 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.823941 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.829749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.829801 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.829813 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.829833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.829850 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.874493 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:06Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:06 crc kubenswrapper[4961]: E0201 19:35:06.874794 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.877702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.877779 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.877803 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.877838 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.877862 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.981607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.981701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.981727 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.981764 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:06 crc kubenswrapper[4961]: I0201 19:35:06.981795 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:06Z","lastTransitionTime":"2026-02-01T19:35:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.085279 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.085386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.085413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.085446 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.085468 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.189174 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.189238 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.189251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.189272 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.189284 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.228852 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 02:01:42.733763338 +0000 UTC Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.234369 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:07 crc kubenswrapper[4961]: E0201 19:35:07.234587 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.292194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.292264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.292289 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.292316 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.292339 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.396304 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.396382 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.396405 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.396436 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.396459 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.500368 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.500444 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.500463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.500492 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.500511 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.603883 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.603974 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.603996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.604028 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.604082 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.707378 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.707453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.707477 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.707502 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.707522 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.811549 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.811650 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.811671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.811706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.811730 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.915394 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.915484 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.915508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.915539 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:07 crc kubenswrapper[4961]: I0201 19:35:07.915562 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:07Z","lastTransitionTime":"2026-02-01T19:35:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.019848 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.019994 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.020012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.020132 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.020166 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.124990 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.125050 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.125059 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.125085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.125095 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.227743 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.227800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.227812 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.227832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.227845 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.229935 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 02:45:17.98660093 +0000 UTC Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.234341 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.234418 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.234487 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:08 crc kubenswrapper[4961]: E0201 19:35:08.234622 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:08 crc kubenswrapper[4961]: E0201 19:35:08.234772 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:08 crc kubenswrapper[4961]: E0201 19:35:08.234829 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.330944 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.331003 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.331014 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.331032 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.331061 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.434231 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.434276 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.434286 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.434304 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.434314 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.538447 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.538495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.538523 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.538545 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.538562 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.641874 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.642649 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.642718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.642806 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.642851 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.747358 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.747418 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.747432 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.747457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.747470 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.850726 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.850809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.850831 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.850864 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.850884 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.955465 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.955534 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.955553 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.955580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:08 crc kubenswrapper[4961]: I0201 19:35:08.955597 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:08Z","lastTransitionTime":"2026-02-01T19:35:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.059303 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.059555 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.059619 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.059689 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.059753 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.163753 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.163816 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.163836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.163864 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.163883 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.230858 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 01:26:13.466522438 +0000 UTC Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.234305 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:09 crc kubenswrapper[4961]: E0201 19:35:09.234664 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.266971 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.267024 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.267067 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.267089 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.267101 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.369797 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.369869 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.369885 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.369910 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.369929 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.472978 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.473072 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.473092 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.473119 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.473137 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.577366 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.577426 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.577444 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.577471 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.577492 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.679969 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.680085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.680104 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.680135 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.680154 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.783880 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.783945 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.783963 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.783990 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.784008 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.887690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.887787 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.887816 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.887850 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.887874 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.990950 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.991015 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.991070 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.991103 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:09 crc kubenswrapper[4961]: I0201 19:35:09.991127 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:09Z","lastTransitionTime":"2026-02-01T19:35:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.094923 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.094988 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.095008 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.095033 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.095087 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.197701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.197761 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.197780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.197807 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.197826 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.231585 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:11:54.402606115 +0000 UTC Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.235151 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.235291 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:10 crc kubenswrapper[4961]: E0201 19:35:10.235385 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:10 crc kubenswrapper[4961]: E0201 19:35:10.235646 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.235898 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:10 crc kubenswrapper[4961]: E0201 19:35:10.236090 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.301148 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.301213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.301222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.301240 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.301252 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.405376 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.405439 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.405461 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.405493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.405515 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.508748 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.508817 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.508834 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.508867 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.508885 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.611881 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.611932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.611943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.611963 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.611977 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.713894 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.713930 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.713938 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.713954 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.713962 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.817120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.817166 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.817179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.817199 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.817212 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.919433 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.919556 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.919575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.919595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:10 crc kubenswrapper[4961]: I0201 19:35:10.919633 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:10Z","lastTransitionTime":"2026-02-01T19:35:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.021827 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.021881 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.021896 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.021921 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.021938 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.124657 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.124707 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.124719 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.124736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.124748 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.227866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.227917 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.227934 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.227959 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.227978 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.232200 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 08:06:04.335062964 +0000 UTC Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.234523 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:11 crc kubenswrapper[4961]: E0201 19:35:11.234697 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.331121 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.331188 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.331213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.331246 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.331270 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.433297 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.433639 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.433800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.433954 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.434126 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.536777 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.536828 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.536836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.536855 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.536865 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.639632 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.639688 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.639706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.639730 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.639746 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.742527 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.742575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.742584 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.742599 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.742610 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.845128 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.845196 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.845216 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.845248 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.845270 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.947120 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.947158 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.947170 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.947187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:11 crc kubenswrapper[4961]: I0201 19:35:11.947197 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:11Z","lastTransitionTime":"2026-02-01T19:35:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.049299 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.049397 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.049424 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.049463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.049484 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.152615 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.152661 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.152671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.152690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.152704 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.233262 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 12:55:14.811499396 +0000 UTC Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.234478 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.234810 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:12 crc kubenswrapper[4961]: E0201 19:35:12.234799 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.234855 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:12 crc kubenswrapper[4961]: E0201 19:35:12.235217 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:12 crc kubenswrapper[4961]: E0201 19:35:12.235330 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.235547 4961 scope.go:117] "RemoveContainer" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" Feb 01 19:35:12 crc kubenswrapper[4961]: E0201 19:35:12.235795 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.254867 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.254899 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.254910 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.254926 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.254937 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.358146 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.358194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.358205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.358226 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.358238 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.461027 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.461139 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.461158 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.461193 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.461216 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.564267 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.564350 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.564367 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.564395 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.564483 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.667241 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.667302 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.667320 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.667345 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.667364 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.769887 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.769926 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.769939 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.769955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.769970 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.871870 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.871940 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.871958 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.871984 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.872004 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.974580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.974623 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.974634 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.974652 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:12 crc kubenswrapper[4961]: I0201 19:35:12.974663 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:12Z","lastTransitionTime":"2026-02-01T19:35:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.076301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.076333 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.076341 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.076354 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.076364 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.178108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.178137 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.178147 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.178159 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.178167 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.233782 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 20:54:01.789126739 +0000 UTC Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.235007 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:13 crc kubenswrapper[4961]: E0201 19:35:13.235129 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.280782 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.280901 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.280924 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.280995 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.281015 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.383175 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.383414 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.383493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.383566 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.383628 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.486736 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.486803 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.486822 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.486848 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.486866 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.594132 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.594192 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.594205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.594229 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.594246 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.697695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.697811 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.697920 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.697964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.697983 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.756808 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:13 crc kubenswrapper[4961]: E0201 19:35:13.756968 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:35:13 crc kubenswrapper[4961]: E0201 19:35:13.757109 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:35:45.75706957 +0000 UTC m=+100.282101033 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.801382 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.801425 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.801439 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.801458 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.801471 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.903936 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.903989 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.904004 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.904026 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:13 crc kubenswrapper[4961]: I0201 19:35:13.904069 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:13Z","lastTransitionTime":"2026-02-01T19:35:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.006712 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.006758 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.006769 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.006785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.006795 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.108953 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.109003 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.109016 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.109052 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.109063 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.211392 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.211430 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.211438 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.211452 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.211461 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.234863 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 06:20:42.811209382 +0000 UTC Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.234997 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.235007 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.235077 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:14 crc kubenswrapper[4961]: E0201 19:35:14.235184 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:14 crc kubenswrapper[4961]: E0201 19:35:14.235268 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:14 crc kubenswrapper[4961]: E0201 19:35:14.235334 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.313855 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.313910 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.313931 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.313954 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.313972 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.417976 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.418077 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.418100 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.418127 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.418147 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.520921 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.520971 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.521005 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.521026 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.521056 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.623085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.623121 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.623130 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.623146 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.623155 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.725536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.725582 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.725593 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.725611 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.725626 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.828979 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.829028 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.829057 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.829078 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.829090 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.932189 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.932249 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.932264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.932285 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:14 crc kubenswrapper[4961]: I0201 19:35:14.932300 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:14Z","lastTransitionTime":"2026-02-01T19:35:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.034591 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.034641 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.034655 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.034674 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.034690 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.137395 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.137445 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.137456 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.137474 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.137484 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.234608 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:15 crc kubenswrapper[4961]: E0201 19:35:15.234878 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.235155 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 07:09:04.78861175 +0000 UTC Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.240396 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.240425 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.240435 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.240449 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.240459 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.250146 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.343814 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.343882 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.343901 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.343927 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.343949 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.447724 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.447780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.447797 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.447818 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.447833 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.550704 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.550793 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.550812 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.550840 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.550859 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.653698 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.653746 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.653758 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.653778 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.653790 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.724723 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/0.log" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.724802 4961 generic.go:334] "Generic (PLEG): container finished" podID="9a7db599-5d41-4e19-b703-6bb56822b4ff" containerID="254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f" exitCode=1 Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.724900 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerDied","Data":"254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.727598 4961 scope.go:117] "RemoveContainer" containerID="254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.742560 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1
32fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.755763 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.755801 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.755815 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.755835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.755847 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.761694 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.778453 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.791724 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.805220 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.817784 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.836466 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.852063 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.864459 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.864537 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.864572 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.864587 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.864624 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.867839 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.884810 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.897446 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.917597 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.969028 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.969081 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.969090 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.969123 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.969133 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:15Z","lastTransitionTime":"2026-02-01T19:35:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:15 crc kubenswrapper[4961]: I0201 19:35:15.989563 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:15Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.024931 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.040580 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.056781 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.071263 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.071325 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.071336 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.071353 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.071363 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.078678 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.095178 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.110710 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.173790 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.173824 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.173832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.173846 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.173855 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.234837 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.234857 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:16 crc kubenswrapper[4961]: E0201 19:35:16.234966 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.234837 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:16 crc kubenswrapper[4961]: E0201 19:35:16.235107 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:16 crc kubenswrapper[4961]: E0201 19:35:16.235197 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.235199 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 06:33:33.557573701 +0000 UTC Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.248464 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.260263 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.276431 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.276467 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.276477 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.276507 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.276516 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.279649 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.296378 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.307649 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.321621 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.339978 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.355307 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.367933 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.378693 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.378724 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.378733 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.378749 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.378760 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.385557 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.401634 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.414313 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.441295 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c7
6b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.460320 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.476232 4961 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.481546 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.481575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.481587 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.481607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.481648 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.494361 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.529458 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.545584 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.564511 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.584412 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.584449 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc 
kubenswrapper[4961]: I0201 19:35:16.584459 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.584478 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.584489 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.687895 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.687937 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.687947 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.687964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.687974 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.728638 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/0.log" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.728679 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerStarted","Data":"fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.748424 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.765359 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.778515 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.791354 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.791407 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.791422 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.791447 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.791464 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.792422 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.809683 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.819710 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.831124 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 
19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.845203 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.859453 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.871716 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.890736 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.904447 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.920213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.920251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.920261 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.920278 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.920289 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:16Z","lastTransitionTime":"2026-02-01T19:35:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.922168 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.937430 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.953988 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c7
6b323fbb07f3a03cbe6a15cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.968941 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:16 crc kubenswrapper[4961]: I0201 19:35:16.984074 4961 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc66886
03596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.000400 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initCon
tainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-
02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wher
eabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:16Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.017498 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.022274 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.022302 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.022311 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 
19:35:17.022326 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.022336 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.124593 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.124660 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.124680 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.124708 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.124726 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.227613 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.227907 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.227923 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.227942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.228140 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.235125 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.235324 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.235371 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:31:50.911040183 +0000 UTC Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.270320 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.270476 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.270565 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.270674 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.270766 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.286690 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.293423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.293493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.293510 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.293536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.293558 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.309360 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.314350 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.314419 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.314438 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.314465 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.314484 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.336322 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.340694 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.340720 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.340729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.340744 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.340756 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.358993 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.362575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.362610 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.362621 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.362634 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.362647 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.378677 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:17Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:17 crc kubenswrapper[4961]: E0201 19:35:17.378791 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.380597 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.380775 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.380907 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.381066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.381193 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.483963 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.484014 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.484025 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.484055 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.484065 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.587505 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.587823 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.587953 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.588127 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.588270 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.690959 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.691018 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.691067 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.691094 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.691111 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.793899 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.794168 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.794286 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.794384 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.794462 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.897006 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.897173 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.897231 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.897309 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:17 crc kubenswrapper[4961]: I0201 19:35:17.897370 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:17Z","lastTransitionTime":"2026-02-01T19:35:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.000146 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.000527 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.000652 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.000844 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.001012 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.104234 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.104279 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.104290 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.104309 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.104322 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.207130 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.207165 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.207175 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.207191 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.207200 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.234600 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.234673 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.234756 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.235805 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 19:56:39.211610329 +0000 UTC Feb 01 19:35:18 crc kubenswrapper[4961]: E0201 19:35:18.236441 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:18 crc kubenswrapper[4961]: E0201 19:35:18.236533 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:18 crc kubenswrapper[4961]: E0201 19:35:18.236587 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.310093 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.310140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.310154 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.310175 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.310186 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.413157 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.413206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.413217 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.413233 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.413246 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.515508 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.515578 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.515591 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.515613 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.515628 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.618915 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.618947 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.618955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.618970 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.618978 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.721644 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.721705 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.721721 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.721742 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.721755 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.826530 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.826575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.826589 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.826607 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.826620 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.928554 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.928620 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.928642 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.928674 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:18 crc kubenswrapper[4961]: I0201 19:35:18.928692 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:18Z","lastTransitionTime":"2026-02-01T19:35:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.030905 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.030962 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.030979 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.031006 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.031029 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.133727 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.133802 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.133824 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.133868 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.133913 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.234891 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:19 crc kubenswrapper[4961]: E0201 19:35:19.235136 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.235960 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 04:05:17.917393136 +0000 UTC Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.236573 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.236603 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.236614 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.236628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.236643 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.339531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.339576 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.339588 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.339605 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.339616 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.441920 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.442142 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.442178 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.442211 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.442237 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.545570 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.545638 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.545667 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.545701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.545727 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.649142 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.649179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.649191 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.649206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.649217 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.751473 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.751522 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.751538 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.751561 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.751578 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.853630 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.853671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.853689 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.853711 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.853727 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.956098 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.956153 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.956172 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.956194 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:19 crc kubenswrapper[4961]: I0201 19:35:19.956211 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:19Z","lastTransitionTime":"2026-02-01T19:35:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.058208 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.058245 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.058255 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.058269 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.058279 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.160678 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.160719 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.160732 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.160748 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.160758 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.234600 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:20 crc kubenswrapper[4961]: E0201 19:35:20.234806 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.234629 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:20 crc kubenswrapper[4961]: E0201 19:35:20.234949 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.234606 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:20 crc kubenswrapper[4961]: E0201 19:35:20.235102 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.236663 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 20:47:11.222283913 +0000 UTC Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.262884 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.262920 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.262930 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.262948 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.262958 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.365885 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.365951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.365971 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.365996 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.366013 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.468005 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.468070 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.468080 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.468095 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.468104 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.570867 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.570936 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.570954 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.570979 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.571000 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.673549 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.673617 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.673636 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.673664 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.673681 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.775864 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.775943 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.775962 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.775988 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.776008 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.879242 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.879317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.879335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.879362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.879381 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.981396 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.981463 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.981480 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.981505 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:20 crc kubenswrapper[4961]: I0201 19:35:20.981523 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:20Z","lastTransitionTime":"2026-02-01T19:35:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.084844 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.084908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.084925 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.084951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.084968 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.188247 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.188312 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.188330 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.188356 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.188375 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.234136 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:21 crc kubenswrapper[4961]: E0201 19:35:21.234342 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.237270 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:39:39.068161739 +0000 UTC Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.291481 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.291553 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.291574 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.291598 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.291616 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.394639 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.394773 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.394796 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.394819 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.394840 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.497358 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.497418 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.497437 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.497460 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.497480 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.600793 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.600853 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.600866 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.600887 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.600902 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.704172 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.704241 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.704259 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.704284 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.704304 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.807691 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.807759 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.807777 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.807805 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.807831 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.911246 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.911317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.911336 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.911364 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:21 crc kubenswrapper[4961]: I0201 19:35:21.911383 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:21Z","lastTransitionTime":"2026-02-01T19:35:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.019121 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.019179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.019198 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.019222 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.019239 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.122854 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.122927 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.122951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.122982 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.123005 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.226562 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.226611 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.226626 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.226649 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.226666 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.235344 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.235429 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:22 crc kubenswrapper[4961]: E0201 19:35:22.235537 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:22 crc kubenswrapper[4961]: E0201 19:35:22.235703 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.235808 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:22 crc kubenswrapper[4961]: E0201 19:35:22.235940 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.238078 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 23:28:00.472216307 +0000 UTC Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.329526 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.329588 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.329606 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.329633 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.329653 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.433118 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.433205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.433227 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.433252 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.433272 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.536108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.536180 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.536224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.536252 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.536273 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.640359 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.640457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.640478 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.640506 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.640529 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.743257 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.743307 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.743320 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.743343 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.743357 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.846523 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.846591 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.846609 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.846631 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.846648 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.950138 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.950211 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.950230 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.950265 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:22 crc kubenswrapper[4961]: I0201 19:35:22.950284 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:22Z","lastTransitionTime":"2026-02-01T19:35:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.053798 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.053858 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.053880 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.053908 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.053929 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.157621 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.157684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.157702 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.157734 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.157754 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.234606 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:23 crc kubenswrapper[4961]: E0201 19:35:23.234812 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.238735 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 13:07:48.71243941 +0000 UTC Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.261363 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.261422 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.261445 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.261470 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.261489 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.376635 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.376695 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.376730 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.376755 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.376796 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.479593 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.479676 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.479706 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.479739 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.479759 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.583345 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.583430 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.583449 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.583478 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.583500 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.686815 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.686882 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.686903 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.686932 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.686955 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.789860 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.789933 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.789951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.789980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.790000 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.892294 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.892385 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.892410 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.892438 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.892462 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.995737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.995810 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.995835 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.995867 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:23 crc kubenswrapper[4961]: I0201 19:35:23.995892 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:23Z","lastTransitionTime":"2026-02-01T19:35:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.099690 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.099768 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.099790 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.099817 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.099835 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.203333 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.203411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.203432 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.203461 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.203482 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.235093 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.235139 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.235510 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:24 crc kubenswrapper[4961]: E0201 19:35:24.235515 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:24 crc kubenswrapper[4961]: E0201 19:35:24.235691 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:24 crc kubenswrapper[4961]: E0201 19:35:24.235829 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.236936 4961 scope.go:117] "RemoveContainer" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.238940 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:23:42.387221785 +0000 UTC Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.308794 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.308948 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.308969 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.310847 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.310921 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.413701 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.413760 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.413780 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.413809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.413830 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.516980 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.517072 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.517092 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.517118 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.517137 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.620606 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.620696 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.620716 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.620745 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.620764 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.723827 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.723917 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.723944 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.723979 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.724006 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.759219 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/2.log" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.763970 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.764752 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.784083 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.815704 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.826893 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.826937 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.826951 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.826974 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.826989 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.831471 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.852827 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.865850 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.880663 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.894194 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.908731 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 
19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.920534 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.930255 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.930286 4961 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.930298 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.930316 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.930329 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:24Z","lastTransitionTime":"2026-02-01T19:35:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.942071 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints 
registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.955383 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.969805 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:24 crc kubenswrapper[4961]: I0201 19:35:24.990790 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:24Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.006468 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.020103 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.033804 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.033869 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.033890 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.033909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.033921 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.047464 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.068146 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.081151 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.101829 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.135985 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.136014 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc 
kubenswrapper[4961]: I0201 19:35:25.136022 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.136054 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.136064 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.234867 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:25 crc kubenswrapper[4961]: E0201 19:35:25.235074 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.238448 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.238479 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.238493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.238511 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.238524 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.239011 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 12:34:28.638847707 +0000 UTC Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.341231 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.341299 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.341315 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.341346 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.341363 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.444074 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.444148 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.444162 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.444181 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.444194 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.546262 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.546292 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.546301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.546314 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.546323 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.649413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.649477 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.649495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.649521 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.649541 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.751416 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.751450 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.751459 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.751472 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.751482 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.768574 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/3.log" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.769026 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/2.log" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.771415 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" exitCode=1 Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.771486 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.771614 4961 scope.go:117] "RemoveContainer" containerID="5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.772882 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:35:25 crc kubenswrapper[4961]: E0201 19:35:25.773056 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.793179 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d8183
13870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.804337 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.827991 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.840838 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.853729 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.853765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.853775 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.853794 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.853807 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.855289 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.866258 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.874736 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.883768 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.899362 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.910808 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.921734 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 
19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.932322 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.946415 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.956490 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.956551 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.956568 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.956595 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.956611 4961 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:25Z","lastTransitionTime":"2026-02-01T19:35:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.957805 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.969809 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.981863 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:25 crc kubenswrapper[4961]: I0201 19:35:25.995470 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:25Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.007614 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.025547 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea2478
92c8ae559d79915589f1b170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:25Z\\\",\\\"message\\\":\\\" Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-controllers\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8441, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8442, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8444, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0201 19:35:25.241845 7001 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0201 19:35:25.241792 7001 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node net\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.059786 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.059855 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.059884 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.059917 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.059940 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.162834 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.162895 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.162911 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.162936 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.162951 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.234572 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.234687 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:26 crc kubenswrapper[4961]: E0201 19:35:26.234794 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:26 crc kubenswrapper[4961]: E0201 19:35:26.234984 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.235107 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:26 crc kubenswrapper[4961]: E0201 19:35:26.235279 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.239260 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 09:02:07.02574343 +0000 UTC Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.250326 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.266232 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.266307 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.266330 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.266363 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.266386 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.272901 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.294435 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.306625 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.318646 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.338395 4961 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.351853 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.362095 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 
19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.368769 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.368839 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.368858 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.368882 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.368900 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.375297 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.387973 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.398251 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.410447 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.421765 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.432587 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.445941 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.466540 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea2478
92c8ae559d79915589f1b170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5da741584b0af2f51c2797323b72256fdd4485c76b323fbb07f3a03cbe6a15cf\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:34:57Z\\\",\\\"message\\\":\\\"k-target-xd92c\\\\nI0201 19:34:57.054941 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-5pnzb\\\\nI0201 19:34:57.054954 6612 obj_retry.go:303] Retry object setup: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055026 6612 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI0201 19:34:57.055095 6612 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nF0201 19:34:57.055102 6612 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:34:57\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:25Z\\\",\\\"message\\\":\\\" Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-controllers\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8441, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8442, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8444, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0201 19:35:25.241845 7001 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0201 19:35:25.241792 7001 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node net\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:35:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.472303 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.472343 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.472352 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.472368 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.472380 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.489905 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.505725 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.533347 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.574411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.574445 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.574457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.574476 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.574490 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.676600 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.676670 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.676687 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.676714 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.676733 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.778720 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/3.log" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.780098 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.780168 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.780187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.780217 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.780237 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.784366 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:35:26 crc kubenswrapper[4961]: E0201 19:35:26.784798 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.802155 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d33fdd6d-4e5d-494d-acf0-b4f1966d9801\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b722892015dd57cae96444bd3f2103cd1ab5ed91d451a5d7d8484b72433942bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://132fec993ce313d4bbf7f465441e79b47c60819ec4b20517667061335c281dce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.821753 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://535e9cefe12144eeb939ae78de276d51278853eee0521485235483f691121611\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.842631 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.859092 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5pnzb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a8256b0-9e0d-4560-bbae-08a628c26a2f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0fca47d1c5c9dddffbaaf97b7ca2eb86a2094722ffbd46b38cbcb664507f42b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xm66x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:30Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5pnzb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.877676 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f613a17-3bb1-43f0-8552-0e55a2018f4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f78671c70cd5bbd4c98d91f6b9f031830ed9ae6f21adbc6527c1ee85220f9e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2klsz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" 
for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fgjhv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.882398 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.882474 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.882487 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.882509 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.882522 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.897108 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wcr65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9a7db599-5d41-4e19-b703-6bb56822b4ff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:14Z\\\",\\\"message\\\":\\\"2026-02-01T19:34:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d\\\\n2026-02-01T19:34:29+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_5261814d-8279-4a82-80b5-b89238f15e4d to /host/opt/cni/bin/\\\\n2026-02-01T19:34:29Z [verbose] multus-daemon started\\\\n2026-02-01T19:34:29Z [verbose] Readiness Indicator file check\\\\n2026-02-01T19:35:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:35:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8l69z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wcr65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.913148 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-c9pwd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4a4b5d70-50da-41c9-8016-8134283d74bc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a9cf379e0a3166f671ebf6d6c4014f4cc444c052d532fcdffd6df26eae7d76e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l6n4c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:27Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-c9pwd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.929564 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b330f5dc-7057-4da0-8631-76683675a7a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7615ebf60fe5f4ad99fa3f5272504c533e89548f73aae36dd7eb7511e1cc936f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d3045fcd96c0e05b5f2c6b3388968f9c20e396763c4a2ce929e89f2fa52ecbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-564v9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tpzkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 
19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.945105 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b35577aa-3e17-4907-80d9-b8911ea55c25\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:41Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jc2tt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:41Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-j6fs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.975915 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1dc5aa88-1337-4077-a820-f13902334c80\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-01T19:35:25Z\\\",\\\"message\\\":\\\" Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-controllers\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8441, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8442, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.167\\\\\\\", Port:8444, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0201 19:35:25.241845 7001 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0201 19:35:25.241792 7001 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node net\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:35:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vg76h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ng92q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.994196 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.994264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.994286 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.994316 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.994337 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:26Z","lastTransitionTime":"2026-02-01T19:35:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:26 crc kubenswrapper[4961]: I0201 19:35:26.997720 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b47b7336-2148-44e9-83c1-582d0cd0d16f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-01T19:34:25Z\\\",\\\"message\\\":\\\"lling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0201 19:34:19.899910 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0201 19:34:19.900539 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-737739194/tls.crt::/tmp/serving-cert-737739194/tls.key\\\\\\\"\\\\nI0201 19:34:25.494622 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0201 19:34:25.502808 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0201 19:34:25.502846 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0201 19:34:25.502892 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0201 19:34:25.502902 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0201 19:34:25.518435 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0201 19:34:25.518479 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518485 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0201 19:34:25.518491 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0201 19:34:25.518494 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0201 19:34:25.518498 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0201 19:34:25.518501 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0201 19:34:25.518529 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0201 19:34:25.524320 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:26Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.014951 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d37d4796-c213-4dd2-859e-6bb685117c00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ccf5d271def9a6e6fec2fccdd7ec32b9be357ce5ac4b517be78aefbf818fbffa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48d83141c70d30ea9e7c1a0fd05ea87e3273e15303228e51add5b042bffdadb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4747bee475e921b6145f81f7674a9fe21a7adbc06204e6dd719d7081f5fbda11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5419a5afaf292ed0df59ab08efb3f193b65b4ed4557f57753856b950108a2d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.035949 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.055132 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbaea78cf9e511b274c6b8921cfc1dbcf039eac1ba94e45bfa0858217825bdaf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.078177 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a62bd434c4db24bfe70dcac5c73f0e481bc7f30861f533f362c140c28bdb33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c83f7e518d166330aef6ec6b3afcdeef2ae634d192dbf20482b9851aa7d96fdb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.096860 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.096918 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.096935 4961 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.096962 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.096981 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.098835 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.132736 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4de6f504-c4df-4873-84ce-2e300e8bc574\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b52b6dbd8e9ebd0e45e7283a140d6245d540b96cc29616c5c2f4f3303e1e87c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c768a4eb9a07f8d76b806466498c5d9ed9e995099e512953fab147f6c6db35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://322d8324477a03cfbd542eb5c9f165bf56160c508726cbb9ec3279be3df22ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7c8cd77d2214dacc2581404a21cb9c6d2d818313870e05c53791e9021322b73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e95c695bc3ddf700554a4582726d543938142d74fd85ce90078e047aad80e51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80fd7c6393e77736bd48261541b6ada6a3cb5c3b006bdb24d0d45d0b2b840bcd\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-02-01T19:34:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2adb9fd5027289fc200e261497cd9ae6c82e49c2c3d828bad31be5e6243b1bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbb9759feabf4430316b6f54012ed362105f39a84d3a7f23bb3c3a68d7c01bde\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.153190 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1aa90c3b-d339-45a0-a5c7-7effbb03c217\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd4cd59d2fbf288339a735277b82714a209dccd49c561285c83e846165a139e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b8e969682986c3dfcfd6b5c11e5740f9dc111d42c110af6cadc03622db7f41b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09748ade16f1f43fd006307e19bbbe4d4dbc6688603596d3d8954d914609e394\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.177859 4961 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d6fb02f0-671c-4a38-bda0-fa32ecfd6930\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-01T19:34:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://504f7ded3666e0d53bc2c4d37352ddea38c4c64ba1a363a43485274b1eac50ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-01T19:34:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03
c8be40877906\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ba76fe2fd6c277c2862ba7aea420e4492222a59a7878fd548b03c8be40877906\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://453b89654ddcb8c5916237a9d1e2cdd2af1a27c303f27a6923b04cc98c295165\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fb617d149da6b9079cacde3a2ee29f8fc7e445ce368b6ca9d637705f5d455444\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dee9e2bf2ca15be3aad5a4aba49419a86dbadbc444d315264396d16db6a09602\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e1bc1427e907207bb818ce29113b2edcca3281a0070ad945f5f911f58caa211\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://b89ea959fd2c45eee33369eecff86d341923396d119fdaca7bdb4fce56609ebe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-01T19:34:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-01T19:34:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccxl8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-01T19:34:28Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6dzl2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.199795 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.199837 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.199847 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.199864 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.199876 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.234846 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.235129 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.240017 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 13:11:44.024537819 +0000 UTC Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.303280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.303347 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.303370 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.303394 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.303413 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.406634 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.406746 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.406777 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.406813 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.406841 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.481795 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.481871 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.481889 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.481909 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.481922 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.503366 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.509073 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.509115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.509127 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.509147 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.509163 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.531410 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.537107 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.537167 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.537195 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.537229 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.537257 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.559392 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.566240 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.566313 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.566333 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.566360 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.566380 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.590168 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.596206 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.596282 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.596301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.596331 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.596352 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.619701 4961 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-01T19:35:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6b25c3e4-bc5f-425e-a087-d711c6f47dc9\\\",\\\"systemUUID\\\":\\\"271ae636-cfe7-445d-a59b-c26de3fa0170\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-01T19:35:27Z is after 2025-08-24T17:21:41Z" Feb 01 19:35:27 crc kubenswrapper[4961]: E0201 19:35:27.620095 4961 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.622429 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.622509 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.622536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.622576 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.622606 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.726423 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.726495 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.726515 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.726543 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.726562 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.830349 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.830426 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.830453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.830481 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.830499 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.934288 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.934370 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.934410 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.934431 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:27 crc kubenswrapper[4961]: I0201 19:35:27.934444 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:27Z","lastTransitionTime":"2026-02-01T19:35:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.037412 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.037490 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.037506 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.037534 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.037557 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.142189 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.142261 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.142279 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.142304 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.142322 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.238452 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.238505 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.238593 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:28 crc kubenswrapper[4961]: E0201 19:35:28.238762 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:28 crc kubenswrapper[4961]: E0201 19:35:28.239218 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:28 crc kubenswrapper[4961]: E0201 19:35:28.239415 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.240236 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 23:39:45.398021794 +0000 UTC Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.245086 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.245140 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.245159 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.245187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.245206 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.348517 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.348771 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.348870 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.348964 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.349075 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.452591 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.452666 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.452684 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.452716 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.452736 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.556382 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.556451 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.556468 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.556497 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.556516 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.659834 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.659890 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.659907 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.659929 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.659946 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.764444 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.765031 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.765125 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.765158 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.765178 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.868079 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.868154 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.868179 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.868212 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.868234 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.972283 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.972361 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.972378 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.972405 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:28 crc kubenswrapper[4961]: I0201 19:35:28.972426 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:28Z","lastTransitionTime":"2026-02-01T19:35:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.076269 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.076335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.076353 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.076379 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.076401 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.179798 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.179836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.179845 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.179861 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.179869 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.234411 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:29 crc kubenswrapper[4961]: E0201 19:35:29.234544 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.240607 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:38:48.178712936 +0000 UTC Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.281873 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.281924 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.281942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.281966 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.281984 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.384982 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.385338 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.385377 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.385412 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.385438 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.488718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.488785 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.488804 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.488832 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.488850 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.591546 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.591616 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.591636 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.591664 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.591683 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.694648 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.694737 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.694765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.694799 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.694824 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.796978 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.797062 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.797082 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.797108 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.797125 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.900711 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.900779 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.900791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.900812 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:29 crc kubenswrapper[4961]: I0201 19:35:29.900830 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:29Z","lastTransitionTime":"2026-02-01T19:35:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.004568 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.004643 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.004662 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.004689 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.004709 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.107610 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.107691 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.107713 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.107747 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.107771 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.181905 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182159 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.182097952 +0000 UTC m=+148.707129415 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.182243 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.182306 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.182366 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.182406 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182513 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182547 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182571 4961 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182611 4961 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182607 4961 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:35:30 crc 
kubenswrapper[4961]: E0201 19:35:30.182624 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182737 4961 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182761 4961 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182649 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.182631696 +0000 UTC m=+148.707663159 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182846 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.182804831 +0000 UTC m=+148.707836384 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182894 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.182868262 +0000 UTC m=+148.707899975 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.182950 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.182930464 +0000 UTC m=+148.707962167 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.211692 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.211773 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.211799 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.211836 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.211865 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.234602 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.234675 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.234760 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.234685 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.234855 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:30 crc kubenswrapper[4961]: E0201 19:35:30.234999 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.241240 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 08:07:59.787582068 +0000 UTC Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.315346 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.315428 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.315446 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.315477 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.315497 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.419457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.419525 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.419543 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.419573 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.419594 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.522097 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.522580 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.522714 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.522820 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.522913 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.626145 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.626421 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.626522 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.626619 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.626704 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.729752 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.729821 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.729833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.729849 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.729858 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.831967 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.832254 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.832281 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.833199 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.833262 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.935891 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.935942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.935955 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.935975 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:30 crc kubenswrapper[4961]: I0201 19:35:30.935988 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:30Z","lastTransitionTime":"2026-02-01T19:35:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.038754 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.038820 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.038833 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.038850 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.038862 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.142115 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.142202 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.142231 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.142264 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.142288 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.234534 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:31 crc kubenswrapper[4961]: E0201 19:35:31.234909 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.241423 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:44:02.499886545 +0000 UTC Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.245494 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.245551 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.245574 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.245598 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.245617 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.348552 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.348665 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.348685 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.348715 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.348736 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.452209 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.452283 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.452301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.452327 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.452344 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.555612 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.555671 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.555688 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.555715 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.555734 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.659105 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.659174 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.659196 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.659230 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.659252 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.764261 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.764337 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.764360 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.764391 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.764413 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.867090 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.867157 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.867182 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.867213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.867239 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.970524 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.970585 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.970605 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.970635 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:31 crc kubenswrapper[4961]: I0201 19:35:31.970656 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:31Z","lastTransitionTime":"2026-02-01T19:35:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.075233 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.075335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.075363 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.075403 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.075429 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.179361 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.179436 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.179455 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.179483 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.179505 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.235013 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.235098 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.235229 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:32 crc kubenswrapper[4961]: E0201 19:35:32.235444 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:32 crc kubenswrapper[4961]: E0201 19:35:32.235610 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:32 crc kubenswrapper[4961]: E0201 19:35:32.235824 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.241975 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 11:01:34.328590348 +0000 UTC Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.283279 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.283359 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.283381 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.283413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.283435 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.387457 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.387536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.387553 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.387582 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.387603 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.490933 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.490994 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.491012 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.491066 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.491086 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.594102 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.594168 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.594187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.594216 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.594236 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.697417 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.697489 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.697510 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.697537 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.697556 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.801060 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.801126 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.801138 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.801164 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.801178 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.905132 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.905213 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.905239 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.905280 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:32 crc kubenswrapper[4961]: I0201 19:35:32.905307 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:32Z","lastTransitionTime":"2026-02-01T19:35:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.008488 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.008544 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.008555 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.008571 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.008580 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.111511 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.111569 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.111578 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.111597 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.111607 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.215284 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.215344 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.215362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.215387 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.215398 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.234854 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:33 crc kubenswrapper[4961]: E0201 19:35:33.235158 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.242895 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:10:12.186256909 +0000 UTC Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.319251 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.319298 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.319308 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.319326 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.319338 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.422605 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.422662 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.422673 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.422692 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.422705 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.526309 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.526357 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.526366 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.526386 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.526403 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.630092 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.630151 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.630162 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.630186 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.630200 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.733317 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.733381 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.733395 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.733413 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.733426 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.836721 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.836809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.836837 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.836875 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.836905 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.941428 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.941565 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.941641 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.941707 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:33 crc kubenswrapper[4961]: I0201 19:35:33.941732 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:33Z","lastTransitionTime":"2026-02-01T19:35:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.044518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.044560 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.044575 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.044600 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.044615 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.154705 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.154831 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.155098 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.155153 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.155234 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.234665 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:34 crc kubenswrapper[4961]: E0201 19:35:34.234935 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.235414 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.235422 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:34 crc kubenswrapper[4961]: E0201 19:35:34.235655 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:34 crc kubenswrapper[4961]: E0201 19:35:34.235785 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.243906 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 04:56:47.270652619 +0000 UTC Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.258459 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.258519 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.258538 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.258562 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.258582 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.363362 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.363435 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.363453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.363481 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.363502 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.467326 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.467388 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.467407 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.467433 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.467453 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.571610 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.571691 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.571709 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.571739 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.571763 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.675493 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.675576 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.675596 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.675628 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.675654 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.779798 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.779864 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.779880 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.779906 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.779925 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.883335 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.883421 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.883531 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.883606 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.883643 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.988301 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.988391 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.988410 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.988440 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:34 crc kubenswrapper[4961]: I0201 19:35:34.988460 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:34Z","lastTransitionTime":"2026-02-01T19:35:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.091861 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.091942 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.091960 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.091993 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.092016 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.195947 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.196020 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.196085 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.196114 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.196134 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.234878 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:35 crc kubenswrapper[4961]: E0201 19:35:35.235129 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.244524 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:36:43.287969794 +0000 UTC Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.300596 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.300658 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.300667 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.300685 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.300696 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.403446 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.403518 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.403536 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.403567 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.403586 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.507370 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.507455 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.507476 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.507507 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.507529 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.611426 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.611494 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.611512 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.611538 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.611556 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.714533 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.714605 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.714623 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.714650 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.714672 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.817679 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.817759 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.817779 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.817807 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.817827 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.922646 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.922733 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.922757 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.922790 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:35 crc kubenswrapper[4961]: I0201 19:35:35.922812 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:35Z","lastTransitionTime":"2026-02-01T19:35:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.026718 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.026786 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.026799 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.026828 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.026843 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.130114 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.130202 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.130229 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.130265 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.130293 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.233092 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.233182 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.233214 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.233250 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.233278 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.234096 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.234169 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:36 crc kubenswrapper[4961]: E0201 19:35:36.234346 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.234373 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:36 crc kubenswrapper[4961]: E0201 19:35:36.234692 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:36 crc kubenswrapper[4961]: E0201 19:35:36.234832 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.244718 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 18:45:44.445996048 +0000 UTC Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.309378 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5pnzb" podStartSLOduration=69.309338672 podStartE2EDuration="1m9.309338672s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.289108508 +0000 UTC m=+90.814139991" watchObservedRunningTime="2026-02-01 19:35:36.309338672 +0000 UTC m=+90.834370125" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.334734 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.334697112 podStartE2EDuration="21.334697112s" podCreationTimestamp="2026-02-01 19:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.309678701 +0000 UTC m=+90.834710124" watchObservedRunningTime="2026-02-01 19:35:36.334697112 +0000 UTC m=+90.859728565" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.337187 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.337236 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.337252 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.337278 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.337298 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.358938 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c9pwd" podStartSLOduration=69.358913131 podStartE2EDuration="1m9.358913131s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.358218452 +0000 UTC m=+90.883249955" watchObservedRunningTime="2026-02-01 19:35:36.358913131 +0000 UTC m=+90.883944594" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.384235 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tpzkt" podStartSLOduration=69.384207569 podStartE2EDuration="1m9.384207569s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.381905958 +0000 UTC m=+90.906937381" watchObservedRunningTime="2026-02-01 19:35:36.384207569 +0000 UTC m=+90.909239002" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.424125 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podStartSLOduration=69.424087172 podStartE2EDuration="1m9.424087172s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.423833365 +0000 UTC m=+90.948864828" watchObservedRunningTime="2026-02-01 19:35:36.424087172 +0000 UTC m=+90.949118635" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.440198 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.440261 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.440276 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.440298 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.440314 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.451729 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wcr65" podStartSLOduration=69.451690191 podStartE2EDuration="1m9.451690191s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.451441035 +0000 UTC m=+90.976472498" watchObservedRunningTime="2026-02-01 19:35:36.451690191 +0000 UTC m=+90.976721654" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.543741 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.543789 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.543800 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.543821 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.543832 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.593800 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.593763763 podStartE2EDuration="1m10.593763763s" podCreationTimestamp="2026-02-01 19:34:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.592257823 +0000 UTC m=+91.117289236" watchObservedRunningTime="2026-02-01 19:35:36.593763763 +0000 UTC m=+91.118795226" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.620862 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.620828837 podStartE2EDuration="43.620828837s" podCreationTimestamp="2026-02-01 19:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.620308013 +0000 UTC m=+91.145339436" watchObservedRunningTime="2026-02-01 19:35:36.620828837 +0000 UTC m=+91.145860270" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.647875 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.647952 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.647974 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.648003 4961 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.648019 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.698279 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=68.698261732 podStartE2EDuration="1m8.698261732s" podCreationTimestamp="2026-02-01 19:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.696126825 +0000 UTC m=+91.221158248" watchObservedRunningTime="2026-02-01 19:35:36.698261732 +0000 UTC m=+91.223293155" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.698854 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6dzl2" podStartSLOduration=69.698850767 podStartE2EDuration="1m9.698850767s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.649983377 +0000 UTC m=+91.175014790" watchObservedRunningTime="2026-02-01 19:35:36.698850767 +0000 UTC m=+91.223882190" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.736683 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.736657946 podStartE2EDuration="1m5.736657946s" podCreationTimestamp="2026-02-01 19:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:36.733392599 +0000 UTC m=+91.258424032" watchObservedRunningTime="2026-02-01 19:35:36.736657946 +0000 UTC m=+91.261689389" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.750766 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.750809 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.750823 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.750840 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.750855 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.853116 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.853658 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.853831 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.854000 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.854196 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.957976 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.958063 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.958079 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.958101 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:36 crc kubenswrapper[4961]: I0201 19:35:36.958114 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:36Z","lastTransitionTime":"2026-02-01T19:35:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.061812 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.062004 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.062025 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.062083 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.062101 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.165460 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.165559 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.165587 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.165622 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.165647 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.234900 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:37 crc kubenswrapper[4961]: E0201 19:35:37.235154 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.245460 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 09:05:03.800691489 +0000 UTC Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.269682 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.269746 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.269765 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.269791 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.269811 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.373295 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.373365 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.373382 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.373408 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.373425 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.477521 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.477594 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.477615 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.477643 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.477664 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.581292 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.581373 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.581393 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.581429 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.581459 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.685302 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.685385 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.685411 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.685447 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.685477 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.789113 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.789205 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.789224 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.789258 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.789280 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.791292 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.791404 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.791428 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.791453 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.791473 4961 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-01T19:35:37Z","lastTransitionTime":"2026-02-01T19:35:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.865398 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg"] Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.866027 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.869394 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.869467 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.869713 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.869719 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.973696 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9c45537-7620-4f4b-a315-26fc8fdbac24-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.974248 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.974318 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.974382 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c45537-7620-4f4b-a315-26fc8fdbac24-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:37 crc kubenswrapper[4961]: I0201 19:35:37.974412 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9c45537-7620-4f4b-a315-26fc8fdbac24-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076086 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9c45537-7620-4f4b-a315-26fc8fdbac24-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076166 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076216 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076259 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9c45537-7620-4f4b-a315-26fc8fdbac24-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076283 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9c45537-7620-4f4b-a315-26fc8fdbac24-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076418 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.076567 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e9c45537-7620-4f4b-a315-26fc8fdbac24-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.077330 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e9c45537-7620-4f4b-a315-26fc8fdbac24-service-ca\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.086580 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e9c45537-7620-4f4b-a315-26fc8fdbac24-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.101730 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e9c45537-7620-4f4b-a315-26fc8fdbac24-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-9gxlg\" (UID: \"e9c45537-7620-4f4b-a315-26fc8fdbac24\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.198248 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.236466 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:38 crc kubenswrapper[4961]: E0201 19:35:38.236692 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.237272 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:38 crc kubenswrapper[4961]: E0201 19:35:38.237392 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.238099 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:38 crc kubenswrapper[4961]: E0201 19:35:38.238214 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.246274 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 02:13:02.649246168 +0000 UTC Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.246343 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.258738 4961 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.833792 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" event={"ID":"e9c45537-7620-4f4b-a315-26fc8fdbac24","Type":"ContainerStarted","Data":"0a5f0c4374a78a49631025106180e2dcf20cdfaecc0a0b17d420f5464f9745d2"} Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.833937 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" event={"ID":"e9c45537-7620-4f4b-a315-26fc8fdbac24","Type":"ContainerStarted","Data":"267237331cb657873275f8c6bd9ed53563d38194603cb372ce3184eac583d896"} Feb 01 19:35:38 crc kubenswrapper[4961]: I0201 19:35:38.856033 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-9gxlg" podStartSLOduration=71.856004277 podStartE2EDuration="1m11.856004277s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:35:38.854001054 +0000 UTC m=+93.379032517" watchObservedRunningTime="2026-02-01 19:35:38.856004277 +0000 UTC m=+93.381035730" Feb 01 19:35:39 crc kubenswrapper[4961]: I0201 19:35:39.234843 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:39 crc kubenswrapper[4961]: E0201 19:35:39.235130 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:39 crc kubenswrapper[4961]: I0201 19:35:39.236260 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:35:39 crc kubenswrapper[4961]: E0201 19:35:39.236636 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:35:40 crc kubenswrapper[4961]: I0201 19:35:40.235154 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:40 crc kubenswrapper[4961]: I0201 19:35:40.235240 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:40 crc kubenswrapper[4961]: E0201 19:35:40.235327 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:40 crc kubenswrapper[4961]: I0201 19:35:40.235250 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:40 crc kubenswrapper[4961]: E0201 19:35:40.235459 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:40 crc kubenswrapper[4961]: E0201 19:35:40.235846 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:41 crc kubenswrapper[4961]: I0201 19:35:41.235162 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:41 crc kubenswrapper[4961]: E0201 19:35:41.235394 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:42 crc kubenswrapper[4961]: I0201 19:35:42.234858 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:42 crc kubenswrapper[4961]: I0201 19:35:42.234895 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:42 crc kubenswrapper[4961]: I0201 19:35:42.234931 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:42 crc kubenswrapper[4961]: E0201 19:35:42.235128 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:42 crc kubenswrapper[4961]: E0201 19:35:42.235339 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:42 crc kubenswrapper[4961]: E0201 19:35:42.235486 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:43 crc kubenswrapper[4961]: I0201 19:35:43.234204 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:43 crc kubenswrapper[4961]: E0201 19:35:43.234724 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:44 crc kubenswrapper[4961]: I0201 19:35:44.235254 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:44 crc kubenswrapper[4961]: I0201 19:35:44.235395 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:44 crc kubenswrapper[4961]: E0201 19:35:44.235472 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:44 crc kubenswrapper[4961]: E0201 19:35:44.235619 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:44 crc kubenswrapper[4961]: I0201 19:35:44.235255 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:44 crc kubenswrapper[4961]: E0201 19:35:44.235876 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:45 crc kubenswrapper[4961]: I0201 19:35:45.234819 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:45 crc kubenswrapper[4961]: E0201 19:35:45.235112 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:45 crc kubenswrapper[4961]: I0201 19:35:45.769777 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:45 crc kubenswrapper[4961]: E0201 19:35:45.770006 4961 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:35:45 crc kubenswrapper[4961]: E0201 19:35:45.770121 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs podName:b35577aa-3e17-4907-80d9-b8911ea55c25 nodeName:}" failed. No retries permitted until 2026-02-01 19:36:49.770094463 +0000 UTC m=+164.295125926 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs") pod "network-metrics-daemon-j6fs5" (UID: "b35577aa-3e17-4907-80d9-b8911ea55c25") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 01 19:35:46 crc kubenswrapper[4961]: I0201 19:35:46.234382 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:46 crc kubenswrapper[4961]: I0201 19:35:46.234400 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:46 crc kubenswrapper[4961]: E0201 19:35:46.236423 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:46 crc kubenswrapper[4961]: I0201 19:35:46.236464 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:46 crc kubenswrapper[4961]: E0201 19:35:46.236563 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:46 crc kubenswrapper[4961]: E0201 19:35:46.236666 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:47 crc kubenswrapper[4961]: I0201 19:35:47.234182 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:47 crc kubenswrapper[4961]: E0201 19:35:47.234346 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:48 crc kubenswrapper[4961]: I0201 19:35:48.234941 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:48 crc kubenswrapper[4961]: I0201 19:35:48.235272 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:48 crc kubenswrapper[4961]: E0201 19:35:48.235413 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:48 crc kubenswrapper[4961]: I0201 19:35:48.235463 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:48 crc kubenswrapper[4961]: E0201 19:35:48.235631 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:48 crc kubenswrapper[4961]: E0201 19:35:48.235747 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:49 crc kubenswrapper[4961]: I0201 19:35:49.234424 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:49 crc kubenswrapper[4961]: E0201 19:35:49.234983 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:50 crc kubenswrapper[4961]: I0201 19:35:50.234183 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:50 crc kubenswrapper[4961]: I0201 19:35:50.234279 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:50 crc kubenswrapper[4961]: I0201 19:35:50.234636 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:50 crc kubenswrapper[4961]: E0201 19:35:50.234813 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:50 crc kubenswrapper[4961]: E0201 19:35:50.234940 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:50 crc kubenswrapper[4961]: E0201 19:35:50.235391 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:51 crc kubenswrapper[4961]: I0201 19:35:51.234798 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:51 crc kubenswrapper[4961]: E0201 19:35:51.235095 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:52 crc kubenswrapper[4961]: I0201 19:35:52.235389 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:52 crc kubenswrapper[4961]: I0201 19:35:52.235456 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:52 crc kubenswrapper[4961]: E0201 19:35:52.235697 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:52 crc kubenswrapper[4961]: I0201 19:35:52.235746 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:52 crc kubenswrapper[4961]: E0201 19:35:52.235857 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:52 crc kubenswrapper[4961]: E0201 19:35:52.235953 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:52 crc kubenswrapper[4961]: I0201 19:35:52.238597 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:35:52 crc kubenswrapper[4961]: E0201 19:35:52.238974 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:35:53 crc kubenswrapper[4961]: I0201 19:35:53.235200 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:53 crc kubenswrapper[4961]: E0201 19:35:53.235418 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:54 crc kubenswrapper[4961]: I0201 19:35:54.234809 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:54 crc kubenswrapper[4961]: I0201 19:35:54.234926 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:54 crc kubenswrapper[4961]: E0201 19:35:54.235113 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:54 crc kubenswrapper[4961]: I0201 19:35:54.235197 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:54 crc kubenswrapper[4961]: E0201 19:35:54.235391 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:54 crc kubenswrapper[4961]: E0201 19:35:54.235521 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:55 crc kubenswrapper[4961]: I0201 19:35:55.234604 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:55 crc kubenswrapper[4961]: E0201 19:35:55.234884 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:56 crc kubenswrapper[4961]: I0201 19:35:56.234820 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:56 crc kubenswrapper[4961]: E0201 19:35:56.235006 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:56 crc kubenswrapper[4961]: I0201 19:35:56.235633 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:56 crc kubenswrapper[4961]: I0201 19:35:56.235646 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:56 crc kubenswrapper[4961]: E0201 19:35:56.237641 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:56 crc kubenswrapper[4961]: E0201 19:35:56.237829 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:57 crc kubenswrapper[4961]: I0201 19:35:57.234461 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:57 crc kubenswrapper[4961]: E0201 19:35:57.235194 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:35:58 crc kubenswrapper[4961]: I0201 19:35:58.234707 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:35:58 crc kubenswrapper[4961]: I0201 19:35:58.234756 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:35:58 crc kubenswrapper[4961]: I0201 19:35:58.234898 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:35:58 crc kubenswrapper[4961]: E0201 19:35:58.235974 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:35:58 crc kubenswrapper[4961]: E0201 19:35:58.236066 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:35:58 crc kubenswrapper[4961]: E0201 19:35:58.236128 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:35:59 crc kubenswrapper[4961]: I0201 19:35:59.234836 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:35:59 crc kubenswrapper[4961]: E0201 19:35:59.234984 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:00 crc kubenswrapper[4961]: I0201 19:36:00.234601 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:00 crc kubenswrapper[4961]: I0201 19:36:00.234601 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:00 crc kubenswrapper[4961]: I0201 19:36:00.234632 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:00 crc kubenswrapper[4961]: E0201 19:36:00.234767 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:00 crc kubenswrapper[4961]: E0201 19:36:00.235189 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:00 crc kubenswrapper[4961]: E0201 19:36:00.235301 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.234481 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:01 crc kubenswrapper[4961]: E0201 19:36:01.234660 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.920738 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/1.log" Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.922251 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/0.log" Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.922332 4961 generic.go:334] "Generic (PLEG): container finished" podID="9a7db599-5d41-4e19-b703-6bb56822b4ff" containerID="fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9" exitCode=1 Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.922408 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerDied","Data":"fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9"} Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.922510 4961 scope.go:117] "RemoveContainer" containerID="254b780cf2204ec8211f410b26473dc86ade194e8c18bb9ddb8ef93c961b921f" Feb 01 19:36:01 crc kubenswrapper[4961]: I0201 19:36:01.923143 4961 scope.go:117] "RemoveContainer" containerID="fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9" Feb 01 19:36:01 crc kubenswrapper[4961]: E0201 19:36:01.923446 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wcr65_openshift-multus(9a7db599-5d41-4e19-b703-6bb56822b4ff)\"" pod="openshift-multus/multus-wcr65" podUID="9a7db599-5d41-4e19-b703-6bb56822b4ff" Feb 01 19:36:02 crc kubenswrapper[4961]: I0201 19:36:02.235227 4961 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:02 crc kubenswrapper[4961]: I0201 19:36:02.235324 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:02 crc kubenswrapper[4961]: E0201 19:36:02.235453 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:02 crc kubenswrapper[4961]: I0201 19:36:02.235511 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:02 crc kubenswrapper[4961]: E0201 19:36:02.235846 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:02 crc kubenswrapper[4961]: E0201 19:36:02.236001 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:02 crc kubenswrapper[4961]: I0201 19:36:02.929383 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/1.log" Feb 01 19:36:03 crc kubenswrapper[4961]: I0201 19:36:03.234169 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:03 crc kubenswrapper[4961]: E0201 19:36:03.234394 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:03 crc kubenswrapper[4961]: I0201 19:36:03.235256 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:36:03 crc kubenswrapper[4961]: E0201 19:36:03.235466 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ng92q_openshift-ovn-kubernetes(1dc5aa88-1337-4077-a820-f13902334c80)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" Feb 01 19:36:04 crc kubenswrapper[4961]: I0201 19:36:04.235294 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:04 crc kubenswrapper[4961]: I0201 19:36:04.235432 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:04 crc kubenswrapper[4961]: E0201 19:36:04.235484 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:04 crc kubenswrapper[4961]: I0201 19:36:04.235579 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:04 crc kubenswrapper[4961]: E0201 19:36:04.235728 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:04 crc kubenswrapper[4961]: E0201 19:36:04.235797 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:05 crc kubenswrapper[4961]: I0201 19:36:05.234794 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:05 crc kubenswrapper[4961]: E0201 19:36:05.234957 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:06 crc kubenswrapper[4961]: E0201 19:36:06.159664 4961 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 01 19:36:06 crc kubenswrapper[4961]: I0201 19:36:06.234253 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:06 crc kubenswrapper[4961]: I0201 19:36:06.234323 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:06 crc kubenswrapper[4961]: I0201 19:36:06.234421 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:06 crc kubenswrapper[4961]: E0201 19:36:06.236345 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:06 crc kubenswrapper[4961]: E0201 19:36:06.236466 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:06 crc kubenswrapper[4961]: E0201 19:36:06.236663 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:06 crc kubenswrapper[4961]: E0201 19:36:06.389877 4961 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 19:36:07 crc kubenswrapper[4961]: I0201 19:36:07.234235 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:07 crc kubenswrapper[4961]: E0201 19:36:07.234442 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:08 crc kubenswrapper[4961]: I0201 19:36:08.234838 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:08 crc kubenswrapper[4961]: I0201 19:36:08.234937 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:08 crc kubenswrapper[4961]: I0201 19:36:08.234965 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:08 crc kubenswrapper[4961]: E0201 19:36:08.235227 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:08 crc kubenswrapper[4961]: E0201 19:36:08.235382 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:08 crc kubenswrapper[4961]: E0201 19:36:08.235523 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:09 crc kubenswrapper[4961]: I0201 19:36:09.234241 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:09 crc kubenswrapper[4961]: E0201 19:36:09.234506 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:10 crc kubenswrapper[4961]: I0201 19:36:10.235196 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:10 crc kubenswrapper[4961]: I0201 19:36:10.235236 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:10 crc kubenswrapper[4961]: E0201 19:36:10.235389 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:10 crc kubenswrapper[4961]: E0201 19:36:10.235512 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:10 crc kubenswrapper[4961]: I0201 19:36:10.236461 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:10 crc kubenswrapper[4961]: E0201 19:36:10.236800 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:11 crc kubenswrapper[4961]: I0201 19:36:11.234674 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:11 crc kubenswrapper[4961]: E0201 19:36:11.234902 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:11 crc kubenswrapper[4961]: E0201 19:36:11.390982 4961 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 19:36:12 crc kubenswrapper[4961]: I0201 19:36:12.234469 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:12 crc kubenswrapper[4961]: I0201 19:36:12.234579 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:12 crc kubenswrapper[4961]: I0201 19:36:12.234490 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:12 crc kubenswrapper[4961]: E0201 19:36:12.234682 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:12 crc kubenswrapper[4961]: E0201 19:36:12.234812 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:12 crc kubenswrapper[4961]: E0201 19:36:12.234898 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:13 crc kubenswrapper[4961]: I0201 19:36:13.234508 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:13 crc kubenswrapper[4961]: E0201 19:36:13.234769 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.234497 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:14 crc kubenswrapper[4961]: E0201 19:36:14.235023 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.235094 4961 scope.go:117] "RemoveContainer" containerID="fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.235376 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:14 crc kubenswrapper[4961]: E0201 19:36:14.235550 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.235636 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:14 crc kubenswrapper[4961]: E0201 19:36:14.235756 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.979119 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/1.log" Feb 01 19:36:14 crc kubenswrapper[4961]: I0201 19:36:14.979198 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerStarted","Data":"bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06"} Feb 01 19:36:15 crc kubenswrapper[4961]: I0201 19:36:15.234605 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:15 crc kubenswrapper[4961]: E0201 19:36:15.234866 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:16 crc kubenswrapper[4961]: I0201 19:36:16.234463 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:16 crc kubenswrapper[4961]: I0201 19:36:16.234469 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:16 crc kubenswrapper[4961]: E0201 19:36:16.245513 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:16 crc kubenswrapper[4961]: I0201 19:36:16.246376 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:16 crc kubenswrapper[4961]: E0201 19:36:16.246744 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:16 crc kubenswrapper[4961]: E0201 19:36:16.246621 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:16 crc kubenswrapper[4961]: E0201 19:36:16.392371 4961 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 19:36:17 crc kubenswrapper[4961]: I0201 19:36:17.234966 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:17 crc kubenswrapper[4961]: E0201 19:36:17.235302 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.235154 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:18 crc kubenswrapper[4961]: E0201 19:36:18.235355 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.235853 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:18 crc kubenswrapper[4961]: E0201 19:36:18.236012 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.236330 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.236404 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:36:18 crc kubenswrapper[4961]: E0201 19:36:18.236495 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.996117 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/3.log" Feb 01 19:36:18 crc kubenswrapper[4961]: I0201 19:36:18.999255 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerStarted","Data":"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4"} Feb 01 19:36:19 crc kubenswrapper[4961]: I0201 19:36:18.999949 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:36:19 crc kubenswrapper[4961]: I0201 19:36:19.033602 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podStartSLOduration=112.03356966 podStartE2EDuration="1m52.03356966s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:19.031725219 +0000 UTC m=+133.556756642" watchObservedRunningTime="2026-02-01 19:36:19.03356966 +0000 UTC m=+133.558601083" Feb 01 19:36:19 crc kubenswrapper[4961]: I0201 19:36:19.182255 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6fs5"] Feb 01 19:36:19 crc kubenswrapper[4961]: I0201 19:36:19.182368 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:19 crc kubenswrapper[4961]: E0201 19:36:19.182463 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:20 crc kubenswrapper[4961]: I0201 19:36:20.234249 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:20 crc kubenswrapper[4961]: I0201 19:36:20.234308 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:20 crc kubenswrapper[4961]: I0201 19:36:20.234432 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:20 crc kubenswrapper[4961]: E0201 19:36:20.234440 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:20 crc kubenswrapper[4961]: E0201 19:36:20.234598 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:20 crc kubenswrapper[4961]: E0201 19:36:20.234709 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:21 crc kubenswrapper[4961]: I0201 19:36:21.234794 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:21 crc kubenswrapper[4961]: E0201 19:36:21.234934 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:21 crc kubenswrapper[4961]: E0201 19:36:21.393592 4961 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 01 19:36:22 crc kubenswrapper[4961]: I0201 19:36:22.234627 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:22 crc kubenswrapper[4961]: I0201 19:36:22.234750 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:22 crc kubenswrapper[4961]: E0201 19:36:22.234823 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:22 crc kubenswrapper[4961]: E0201 19:36:22.234932 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:22 crc kubenswrapper[4961]: I0201 19:36:22.234746 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:22 crc kubenswrapper[4961]: E0201 19:36:22.235109 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:23 crc kubenswrapper[4961]: I0201 19:36:23.234738 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:23 crc kubenswrapper[4961]: E0201 19:36:23.235818 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:24 crc kubenswrapper[4961]: I0201 19:36:24.234557 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:24 crc kubenswrapper[4961]: I0201 19:36:24.234650 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:24 crc kubenswrapper[4961]: E0201 19:36:24.234790 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:24 crc kubenswrapper[4961]: E0201 19:36:24.234923 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:24 crc kubenswrapper[4961]: I0201 19:36:24.235200 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:24 crc kubenswrapper[4961]: E0201 19:36:24.235587 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:25 crc kubenswrapper[4961]: I0201 19:36:25.235066 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:25 crc kubenswrapper[4961]: E0201 19:36:25.235228 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j6fs5" podUID="b35577aa-3e17-4907-80d9-b8911ea55c25" Feb 01 19:36:26 crc kubenswrapper[4961]: I0201 19:36:26.234739 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:26 crc kubenswrapper[4961]: I0201 19:36:26.234964 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:26 crc kubenswrapper[4961]: E0201 19:36:26.237174 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 01 19:36:26 crc kubenswrapper[4961]: I0201 19:36:26.237223 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:26 crc kubenswrapper[4961]: E0201 19:36:26.237542 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 01 19:36:26 crc kubenswrapper[4961]: E0201 19:36:26.237426 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 01 19:36:27 crc kubenswrapper[4961]: I0201 19:36:27.234464 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:27 crc kubenswrapper[4961]: I0201 19:36:27.236443 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 19:36:27 crc kubenswrapper[4961]: I0201 19:36:27.237229 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.234837 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.234837 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.234920 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.239026 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.239092 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.239178 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.241404 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.427513 4961 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.478371 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.485760 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.487786 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xppsd"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.490091 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.490447 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.507389 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.507932 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.508269 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.509722 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.510828 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68b9n"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.511499 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.511534 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.511917 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.512369 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.512777 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.513022 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.514819 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.515397 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.516175 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.516710 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.516893 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.528724 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.529121 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.529558 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.530128 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.530304 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.530777 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.531419 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.534369 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.534648 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.534734 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535077 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535171 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535549 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535595 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535617 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cfh9"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.535966 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.536947 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-v684w"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.537494 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.539713 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.539877 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542146 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542300 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542409 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542525 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542622 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542724 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542747 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542833 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542905 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.542953 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.543087 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.543125 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.543920 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.545505 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.547335 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.548012 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.548144 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dw6q4"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.549016 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.549142 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.550565 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.551406 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.551527 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552318 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552328 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552403 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552344 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552589 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552606 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.552778 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553016 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553211 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553318 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553440 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553454 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 19:36:28 
crc kubenswrapper[4961]: I0201 19:36:28.553572 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553608 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553663 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553702 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553769 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553836 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553902 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.553908 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.554018 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.554231 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.554449 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.575992 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pn8rk"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.577058 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.577183 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.592778 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xppsd"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.594360 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.606919 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.607338 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hwdhh"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.608105 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68b9n"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.608266 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.613369 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.613802 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.615582 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.615896 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.616141 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.616295 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.618385 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.618463 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.618670 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.618380 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.618916 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619014 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619254 4961 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619467 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619623 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619744 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.619923 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620120 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620254 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620384 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620484 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620624 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620666 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620766 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.620968 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.621012 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.622549 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.622729 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.626184 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.628884 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629087 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629190 4961 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629296 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629562 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629891 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.629964 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.630190 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.630236 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9w5c5"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.631120 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.631167 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.635743 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.636277 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.636702 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.631247 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.632506 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.632588 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.634603 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.638018 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656199 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-config\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656284 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e705ec1a-dabb-4f4f-942a-66626c9c65d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656320 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656351 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656378 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656413 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-config\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656444 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656473 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656496 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5xd\" (UniqueName: \"kubernetes.io/projected/e705ec1a-dabb-4f4f-942a-66626c9c65d4-kube-api-access-2l5xd\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656515 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656551 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckss\" (UniqueName: \"kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656576 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656599 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656636 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcxq2\" (UniqueName: \"kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656662 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9537efb-5e4e-414f-a0f2-7e566511b95c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656691 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-serving-cert\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656716 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656740 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656767 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-serving-cert\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656791 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656817 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656841 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrmgl\" (UniqueName: \"kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656862 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-dir\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656911 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656932 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-encryption-config\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656955 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-node-pullsecrets\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656999 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-client\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657021 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.640944 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657072 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657097 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v27jk\" (UniqueName: \"kubernetes.io/projected/d9537efb-5e4e-414f-a0f2-7e566511b95c-kube-api-access-v27jk\") pod \"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657122 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgjq\" (UniqueName: \"kubernetes.io/projected/a6e8e0a4-fb78-4de0-8a48-3283e2911ac0-kube-api-access-9qgjq\") pod \"downloads-7954f5f757-v684w\" (UID: \"a6e8e0a4-fb78-4de0-8a48-3283e2911ac0\") " pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657149 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit-dir\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657182 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-encryption-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657203 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657509 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657549 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.641549 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 
19:36:28.657634 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657665 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1bf30422-ee8e-4982-948c-99fd27354727-machine-approver-tls\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657685 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl775\" (UniqueName: \"kubernetes.io/projected/ad1ae82a-4644-4f94-bc94-14605d3acfaa-kube-api-access-kl775\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657760 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89fc5280-7863-4e88-bbe6-b6f82128d057-serving-cert\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657832 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.657884 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-client\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.642457 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658094 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658165 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658187 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658209 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658225 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.642627 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.658520 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.662670 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wk4\" (UniqueName: \"kubernetes.io/projected/1bf30422-ee8e-4982-948c-99fd27354727-kube-api-access-t6wk4\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.662779 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpmp\" (UniqueName: \"kubernetes.io/projected/89fc5280-7863-4e88-bbe6-b6f82128d057-kube-api-access-swpmp\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.662871 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.662985 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663121 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bwnk\" (UniqueName: 
\"kubernetes.io/projected/8ed21892-8bb6-46f1-a593-16b869fb63d3-kube-api-access-9bwnk\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663222 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663301 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663381 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m54l\" (UniqueName: \"kubernetes.io/projected/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-kube-api-access-7m54l\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663465 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-auth-proxy-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663545 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-policies\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663715 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-image-import-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663805 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-images\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663883 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e705ec1a-dabb-4f4f-942a-66626c9c65d4-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.663966 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.656506 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.661861 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.659135 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.662111 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.669502 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.670550 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.670815 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.671846 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.672967 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.673373 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.677333 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-tjxcx"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.685899 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v684w"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.686102 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tjxcx"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.692645 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.694659 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dw6q4"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.695958 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.696186 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.698187 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.698210 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.698399 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.698844 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.699853 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pn8rk"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.701971 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.702839 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.703274 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.703694 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.704493 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.704911 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts"
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.705421 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx"]
Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.705819 4961 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.706454 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.707160 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.707426 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gl2kg"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.708057 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.708224 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.708540 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.709196 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.710160 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.710767 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.711184 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.712014 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.712209 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.713300 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hwdhh"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.714312 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cfh9"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.715295 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.716462 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.717442 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.718629 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zdws9"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.719149 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.719339 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mjk8q"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.719969 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.721768 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48wsp"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.722794 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.723136 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.725439 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.725597 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.726316 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.727663 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9w5c5"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.728853 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.728859 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.730770 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lc945"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.731334 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.731445 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.732549 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.733668 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.734782 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.740416 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.742760 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.749587 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.750102 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.752194 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.753494 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.754906 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.756183 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-mjk8q"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.757530 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gl2kg"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.758907 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48wsp"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.760106 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.761345 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zdws9"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.762487 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.763699 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wgjjb"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.764815 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765064 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-encryption-config\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765092 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-node-pullsecrets\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765124 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5095491b-50cd-4181-97ac-da46cc4608ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765152 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ht7x\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-kube-api-access-9ht7x\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765177 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/11a86cbd-91cd-4966-a44d-423fbdb252d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765202 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-trusted-ca-bundle\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765226 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-client\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765249 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765274 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a86cbd-91cd-4966-a44d-423fbdb252d7-config\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765298 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b5cc4ba-045f-4c69-88c9-763af824474d-trusted-ca\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765322 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-oauth-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765348 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765372 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v27jk\" (UniqueName: \"kubernetes.io/projected/d9537efb-5e4e-414f-a0f2-7e566511b95c-kube-api-access-v27jk\") pod 
\"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765397 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgjq\" (UniqueName: \"kubernetes.io/projected/a6e8e0a4-fb78-4de0-8a48-3283e2911ac0-kube-api-access-9qgjq\") pod \"downloads-7954f5f757-v684w\" (UID: \"a6e8e0a4-fb78-4de0-8a48-3283e2911ac0\") " pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765422 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwct\" (UniqueName: \"kubernetes.io/projected/5420df36-687d-4d73-9628-3ebe3597e733-kube-api-access-fgwct\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765448 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit-dir\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765528 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-encryption-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765554 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765596 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765623 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5mp\" (UniqueName: \"kubernetes.io/projected/b163a77e-e04e-46ab-ac6c-bfadbe238992-kube-api-access-mc5mp\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765686 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-service-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc 
kubenswrapper[4961]: I0201 19:36:28.765711 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765740 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765768 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765798 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1bf30422-ee8e-4982-948c-99fd27354727-machine-approver-tls\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765823 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl775\" (UniqueName: \"kubernetes.io/projected/ad1ae82a-4644-4f94-bc94-14605d3acfaa-kube-api-access-kl775\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765850 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89fc5280-7863-4e88-bbe6-b6f82128d057-serving-cert\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765887 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765911 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-client\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.765990 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5ldh\" (UniqueName: 
\"kubernetes.io/projected/01c9f61a-85b0-499a-9885-1a02903ef457-kube-api-access-l5ldh\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766000 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-node-pullsecrets\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766023 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766160 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766204 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766247 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-trusted-ca\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766291 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766325 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wk4\" (UniqueName: \"kubernetes.io/projected/1bf30422-ee8e-4982-948c-99fd27354727-kube-api-access-t6wk4\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766357 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swpmp\" (UniqueName: \"kubernetes.io/projected/89fc5280-7863-4e88-bbe6-b6f82128d057-kube-api-access-swpmp\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766392 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766435 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766468 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/badc8c22-a530-44b6-9dfc-9fba07b633e8-metrics-tls\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766532 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766578 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bwnk\" (UniqueName: \"kubernetes.io/projected/8ed21892-8bb6-46f1-a593-16b869fb63d3-kube-api-access-9bwnk\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766603 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-serving-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766615 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b163a77e-e04e-46ab-ac6c-bfadbe238992-serving-cert\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766660 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-etcd-client\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766674 4961 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit-dir\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766708 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766746 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1052b2-d1b5-4437-b80c-0b9112d041c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766784 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-service-ca\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766822 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766857 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m54l\" (UniqueName: \"kubernetes.io/projected/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-kube-api-access-7m54l\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766894 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-auth-proxy-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766926 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-policies\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766965 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb1052b2-d1b5-4437-b80c-0b9112d041c8-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.766999 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-serving-cert\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767064 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767080 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-image-import-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767115 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-images\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767155 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e705ec1a-dabb-4f4f-942a-66626c9c65d4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767191 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-config\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767249 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767288 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb6lh\" (UniqueName: \"kubernetes.io/projected/5095491b-50cd-4181-97ac-da46cc4608ca-kube-api-access-nb6lh\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 
19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767336 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767370 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-config\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767411 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e705ec1a-dabb-4f4f-942a-66626c9c65d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767443 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767474 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767578 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767758 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-config\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767842 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767882 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767916 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5xd\" (UniqueName: \"kubernetes.io/projected/e705ec1a-dabb-4f4f-942a-66626c9c65d4-kube-api-access-2l5xd\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767949 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767950 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.767982 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a86cbd-91cd-4966-a44d-423fbdb252d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768016 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdgr2\" (UniqueName: \"kubernetes.io/projected/badc8c22-a530-44b6-9dfc-9fba07b633e8-kube-api-access-xdgr2\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768065 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5095491b-50cd-4181-97ac-da46cc4608ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768110 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6756v\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-kube-api-access-6756v\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768140 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768934 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckss\" (UniqueName: \"kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.768980 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.769013 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.769063 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcxq2\" (UniqueName: \"kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.769119 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9537efb-5e4e-414f-a0f2-7e566511b95c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770003 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lczb5"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770336 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770614 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lc945"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770636 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgjjb"] Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770726 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.770850 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.771261 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.772323 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-image-import-ca\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.773079 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-policies\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.773691 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-etcd-client\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.774071 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.774375 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.774793 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-auth-proxy-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.775332 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.775602 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca\") pod 
\"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.776033 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-config\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.776312 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-config\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.776350 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.774594 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf30422-ee8e-4982-948c-99fd27354727-config\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.776888 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.777269 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.778339 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-service-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779181 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-serving-cert\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779274 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-config\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779310 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b5cc4ba-045f-4c69-88c9-763af824474d-metrics-tls\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779428 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-oauth-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779529 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.779607 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.780666 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.780806 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-serving-cert\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.780854 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-console-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.780890 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies\") pod \"oauth-openshift-558db77b4-597bm\" (UID: 
\"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.780921 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.781317 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrmgl\" (UniqueName: \"kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.781369 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-dir\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.781920 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ad1ae82a-4644-4f94-bc94-14605d3acfaa-audit-dir\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.782743 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.783619 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.783819 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-encryption-config\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.783873 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.784531 4961 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89fc5280-7863-4e88-bbe6-b6f82128d057-serving-cert\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.785416 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-etcd-client\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.785743 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89fc5280-7863-4e88-bbe6-b6f82128d057-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.785731 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-images\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.786523 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.786701 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.786856 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.787616 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.787858 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.787862 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.788249 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.788473 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.788679 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.788678 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.789020 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e705ec1a-dabb-4f4f-942a-66626c9c65d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.789059 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.789081 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1bf30422-ee8e-4982-948c-99fd27354727-machine-approver-tls\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.789748 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-encryption-config\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " 
pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.790119 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8ed21892-8bb6-46f1-a593-16b869fb63d3-audit\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.790275 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.790301 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.790669 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad1ae82a-4644-4f94-bc94-14605d3acfaa-serving-cert\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.791088 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ed21892-8bb6-46f1-a593-16b869fb63d3-serving-cert\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.791674 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9537efb-5e4e-414f-a0f2-7e566511b95c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.791718 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e705ec1a-dabb-4f4f-942a-66626c9c65d4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.811803 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.828975 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.849228 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.869555 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-client" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883583 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-service-ca\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883637 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb1052b2-d1b5-4437-b80c-0b9112d041c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883660 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-serving-cert\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883693 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-config\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883718 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb6lh\" (UniqueName: \"kubernetes.io/projected/5095491b-50cd-4181-97ac-da46cc4608ca-kube-api-access-nb6lh\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883747 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a86cbd-91cd-4966-a44d-423fbdb252d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883765 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdgr2\" (UniqueName: \"kubernetes.io/projected/badc8c22-a530-44b6-9dfc-9fba07b633e8-kube-api-access-xdgr2\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883785 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5095491b-50cd-4181-97ac-da46cc4608ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883804 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883822 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6756v\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-kube-api-access-6756v\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883853 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-config\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883871 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b5cc4ba-045f-4c69-88c9-763af824474d-metrics-tls\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883916 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-oauth-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883939 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883960 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-console-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.883997 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5095491b-50cd-4181-97ac-da46cc4608ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884016 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ht7x\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-kube-api-access-9ht7x\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884063 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11a86cbd-91cd-4966-a44d-423fbdb252d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884087 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-trusted-ca-bundle\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884106 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a86cbd-91cd-4966-a44d-423fbdb252d7-config\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884125 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b5cc4ba-045f-4c69-88c9-763af824474d-trusted-ca\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884142 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-oauth-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884173 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwct\" (UniqueName: \"kubernetes.io/projected/5420df36-687d-4d73-9628-3ebe3597e733-kube-api-access-fgwct\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884216 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884235 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5mp\" (UniqueName: \"kubernetes.io/projected/b163a77e-e04e-46ab-ac6c-bfadbe238992-kube-api-access-mc5mp\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884253 4961 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-service-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884299 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5ldh\" (UniqueName: \"kubernetes.io/projected/01c9f61a-85b0-499a-9885-1a02903ef457-kube-api-access-l5ldh\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884323 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-trusted-ca\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884358 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884377 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/badc8c22-a530-44b6-9dfc-9fba07b633e8-metrics-tls\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884409 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b163a77e-e04e-46ab-ac6c-bfadbe238992-serving-cert\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884428 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-etcd-client\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884452 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1052b2-d1b5-4437-b80c-0b9112d041c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.884594 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.885516 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-service-ca\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.886780 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5095491b-50cd-4181-97ac-da46cc4608ca-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.886829 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b5cc4ba-045f-4c69-88c9-763af824474d-trusted-ca\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.887190 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-console-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.887915 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-config\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.888613 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-oauth-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.888837 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5420df36-687d-4d73-9628-3ebe3597e733-etcd-service-ca\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.889769 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01c9f61a-85b0-499a-9885-1a02903ef457-trusted-ca-bundle\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.889776 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fb1052b2-d1b5-4437-b80c-0b9112d041c8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.889919 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-serving-cert\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.890466 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-trusted-ca\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.890955 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b163a77e-e04e-46ab-ac6c-bfadbe238992-serving-cert\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.891680 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b5cc4ba-045f-4c69-88c9-763af824474d-metrics-tls\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.892140 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.892479 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b163a77e-e04e-46ab-ac6c-bfadbe238992-config\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.892716 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/badc8c22-a530-44b6-9dfc-9fba07b633e8-metrics-tls\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.893848 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb1052b2-d1b5-4437-b80c-0b9112d041c8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.895414 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-etcd-client\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.896419 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01c9f61a-85b0-499a-9885-1a02903ef457-console-oauth-config\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.899798 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5095491b-50cd-4181-97ac-da46cc4608ca-serving-cert\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.909203 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.928750 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.941370 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5420df36-687d-4d73-9628-3ebe3597e733-serving-cert\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.949112 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.969170 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 01 19:36:28 crc kubenswrapper[4961]: I0201 19:36:28.989070 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.002520 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11a86cbd-91cd-4966-a44d-423fbdb252d7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.009508 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.017941 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11a86cbd-91cd-4966-a44d-423fbdb252d7-config\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.069931 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.089627 4961 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.109502 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.129374 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.149926 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.172002 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.189198 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.208954 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.228595 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.249689 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.268331 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.289295 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.309350 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.330457 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.349790 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.369320 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.389520 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.409614 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.429752 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 
01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.450095 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.469580 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.489464 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.509302 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.529807 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.550410 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.569692 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.589629 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.609456 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.629803 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.649343 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.669348 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.690298 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.707530 4961 request.go:700] Waited for 1.003472411s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.710320 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.729483 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.749346 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.769210 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.789223 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.819963 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.830503 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.849405 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.869186 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.889305 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.909971 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.929717 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.949131 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.969914 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 01 19:36:29 crc kubenswrapper[4961]: I0201 19:36:29.995207 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.008837 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.029308 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.049083 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.069467 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.088464 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.109374 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.129301 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 01 
19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.149871 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.170501 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.189893 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.209273 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.229751 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.251908 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.269091 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.292068 4961 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.308914 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.329448 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.350763 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.370505 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.390848 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.409563 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.429746 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.491343 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgjq\" (UniqueName: \"kubernetes.io/projected/a6e8e0a4-fb78-4de0-8a48-3283e2911ac0-kube-api-access-9qgjq\") pod \"downloads-7954f5f757-v684w\" (UID: \"a6e8e0a4-fb78-4de0-8a48-3283e2911ac0\") " pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.493029 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v27jk\" (UniqueName: \"kubernetes.io/projected/d9537efb-5e4e-414f-a0f2-7e566511b95c-kube-api-access-v27jk\") pod \"cluster-samples-operator-665b6dd947-fxmpb\" (UID: \"d9537efb-5e4e-414f-a0f2-7e566511b95c\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.497270 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.505529 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl775\" (UniqueName: \"kubernetes.io/projected/ad1ae82a-4644-4f94-bc94-14605d3acfaa-kube-api-access-kl775\") pod \"apiserver-7bbb656c7d-d28vq\" (UID: \"ad1ae82a-4644-4f94-bc94-14605d3acfaa\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.525131 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.532565 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcxq2\" (UniqueName: \"kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2\") pod \"oauth-openshift-558db77b4-597bm\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.550992 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.558796 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m54l\" (UniqueName: \"kubernetes.io/projected/86a1f5dd-69f3-4ba2-b48f-14aef559e6b2-kube-api-access-7m54l\") pod \"machine-api-operator-5694c8668f-68b9n\" (UID: \"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.570122 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.589755 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.610251 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.629557 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.654253 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.690763 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5xd\" (UniqueName: \"kubernetes.io/projected/e705ec1a-dabb-4f4f-942a-66626c9c65d4-kube-api-access-2l5xd\") pod \"openshift-apiserver-operator-796bbdcf4f-898rx\" (UID: \"e705ec1a-dabb-4f4f-942a-66626c9c65d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.707742 4961 request.go:700] Waited for 1.931426698s due to client-side throttling, not priority and fairness, request: 
POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.714993 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.718724 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wk4\" (UniqueName: \"kubernetes.io/projected/1bf30422-ee8e-4982-948c-99fd27354727-kube-api-access-t6wk4\") pod \"machine-approver-56656f9798-qjgjt\" (UID: \"1bf30422-ee8e-4982-948c-99fd27354727\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.719811 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.731070 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpmp\" (UniqueName: \"kubernetes.io/projected/89fc5280-7863-4e88-bbe6-b6f82128d057-kube-api-access-swpmp\") pod \"authentication-operator-69f744f599-9cfh9\" (UID: \"89fc5280-7863-4e88-bbe6-b6f82128d057\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.748689 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bwnk\" (UniqueName: \"kubernetes.io/projected/8ed21892-8bb6-46f1-a593-16b869fb63d3-kube-api-access-9bwnk\") pod \"apiserver-76f77b778f-xppsd\" (UID: \"8ed21892-8bb6-46f1-a593-16b869fb63d3\") " pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.762810 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.770081 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckss\" (UniqueName: \"kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss\") pod \"route-controller-manager-6576b87f9c-9qwhp\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.784201 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.785230 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.793767 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrmgl\" (UniqueName: \"kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl\") pod \"controller-manager-879f6c89f-nbv69\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.802550 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.812561 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdgr2\" (UniqueName: \"kubernetes.io/projected/badc8c22-a530-44b6-9dfc-9fba07b633e8-kube-api-access-xdgr2\") pod \"dns-operator-744455d44c-hwdhh\" (UID: \"badc8c22-a530-44b6-9dfc-9fba07b633e8\") " pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.814387 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.846846 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11a86cbd-91cd-4966-a44d-423fbdb252d7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7hprk\" (UID: \"11a86cbd-91cd-4966-a44d-423fbdb252d7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.848923 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb"] Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.850519 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-v684w"] Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.855345 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.856859 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5ldh\" (UniqueName: \"kubernetes.io/projected/01c9f61a-85b0-499a-9885-1a02903ef457-kube-api-access-l5ldh\") pod \"console-f9d7485db-dw6q4\" (UID: \"01c9f61a-85b0-499a-9885-1a02903ef457\") " pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.871706 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ht7x\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-kube-api-access-9ht7x\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.886556 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwct\" (UniqueName: \"kubernetes.io/projected/5420df36-687d-4d73-9628-3ebe3597e733-kube-api-access-fgwct\") pod \"etcd-operator-b45778765-9w5c5\" (UID: \"5420df36-687d-4d73-9628-3ebe3597e733\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.910898 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6756v\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-kube-api-access-6756v\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.920795 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.927559 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5mp\" (UniqueName: \"kubernetes.io/projected/b163a77e-e04e-46ab-ac6c-bfadbe238992-kube-api-access-mc5mp\") pod \"console-operator-58897d9998-pn8rk\" (UID: \"b163a77e-e04e-46ab-ac6c-bfadbe238992\") " pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.930141 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.943024 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.951259 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b5cc4ba-045f-4c69-88c9-763af824474d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ptgr7\" (UID: \"4b5cc4ba-045f-4c69-88c9-763af824474d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.982548 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb6lh\" (UniqueName: \"kubernetes.io/projected/5095491b-50cd-4181-97ac-da46cc4608ca-kube-api-access-nb6lh\") pod \"openshift-config-operator-7777fb866f-fl9kc\" (UID: \"5095491b-50cd-4181-97ac-da46cc4608ca\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.984054 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.991770 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fb1052b2-d1b5-4437-b80c-0b9112d041c8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9468p\" (UID: \"fb1052b2-d1b5-4437-b80c-0b9112d041c8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:30 crc kubenswrapper[4961]: I0201 19:36:30.996141 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.022891 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.022952 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.022995 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.023025 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm9zj\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.023682 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.023766 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.023830 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.023848 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.024227 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.524207806 +0000 UTC m=+146.049239339 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.056333 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v684w" event={"ID":"a6e8e0a4-fb78-4de0-8a48-3283e2911ac0","Type":"ContainerStarted","Data":"e158a922d0897e2c74db26cf2c99e3a4eb6a7aa8dbbb3548769bdf85ed30c36c"} Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.079383 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" event={"ID":"ad1ae82a-4644-4f94-bc94-14605d3acfaa","Type":"ContainerStarted","Data":"85745fb195c96d7777dfafc3317c90de4466ca04e31c5d6eb164f2ca29666622"} Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.081742 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" event={"ID":"d9537efb-5e4e-414f-a0f2-7e566511b95c","Type":"ContainerStarted","Data":"29190c0fe67c4539ffe09d718017446d0c9e3fee3804791cd701830129e2b728"} Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.092021 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" event={"ID":"1bf30422-ee8e-4982-948c-99fd27354727","Type":"ContainerStarted","Data":"ae488443655662830a92520bf539f77be3c0f521878fd33c67d5b0aa0e750e01"} Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124469 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.124622 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 19:36:31.624593987 +0000 UTC m=+146.149625410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124732 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26025115-e843-444a-9ab0-78f0a8e2f0cb-config-volume\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124767 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-stats-auth\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124787 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlnwr\" (UniqueName: \"kubernetes.io/projected/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-kube-api-access-nlnwr\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124805 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8361af84-73d3-4429-8158-929d70adc1c4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124846 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124863 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtwbv\" (UniqueName: \"kubernetes.io/projected/26025115-e843-444a-9ab0-78f0a8e2f0cb-kube-api-access-qtwbv\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124943 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwm7s\" (UniqueName: \"kubernetes.io/projected/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-kube-api-access-vwm7s\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124978 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-srv-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.124996 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgv4\" (UniqueName: \"kubernetes.io/projected/8b696447-c20a-4474-b3ed-2286d1e258fb-kube-api-access-mxgv4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125088 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125107 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125140 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzgg\" (UniqueName: \"kubernetes.io/projected/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-kube-api-access-wvzgg\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125157 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjls\" (UniqueName: \"kubernetes.io/projected/71b2ce56-de70-4686-b2d6-9c0813626125-kube-api-access-jbjls\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125180 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125243 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates\") pod 
\"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125276 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125314 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk475\" (UniqueName: \"kubernetes.io/projected/d85040db-48ad-439c-aed0-9152f78666bf-kube-api-access-dk475\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125330 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtq8\" (UniqueName: \"kubernetes.io/projected/9ab40180-9214-4fb8-a267-1ff710ad04e4-kube-api-access-zxtq8\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125348 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125369 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125387 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d85040db-48ad-439c-aed0-9152f78666bf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125426 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wfx\" (UniqueName: \"kubernetes.io/projected/7431686b-4741-4c3d-852c-45e2cb99de38-kube-api-access-m5wfx\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125462 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125490 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125510 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kctkk\" (UniqueName: \"kubernetes.io/projected/1a177e48-a514-480b-8315-2ad8d69c0cd4-kube-api-access-kctkk\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125525 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-metrics-certs\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125541 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkttv\" (UniqueName: \"kubernetes.io/projected/30c54837-57a2-47d2-b2f0-e4c17a056b9e-kube-api-access-kkttv\") pod \"migrator-59844c95c7-dzhm7\" (UID: \"30c54837-57a2-47d2-b2f0-e4c17a056b9e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125597 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/547d4fdd-999f-4578-9efc-38155da0b2fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125613 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-images\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125642 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm9zj\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125659 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-key\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125711 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcj9d\" (UniqueName: \"kubernetes.io/projected/ec361338-d606-471f-acfe-9f94619f21f3-kube-api-access-xcj9d\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125747 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-node-bootstrap-token\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125764 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125804 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125821 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1a177e48-a514-480b-8315-2ad8d69c0cd4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125854 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-profile-collector-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125871 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-webhook-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125933 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-serving-cert\") pod 
\"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125948 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjcf9\" (UniqueName: \"kubernetes.io/projected/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-kube-api-access-cjcf9\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125981 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-proxy-tls\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.125998 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126013 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntsh\" (UniqueName: \"kubernetes.io/projected/547d4fdd-999f-4578-9efc-38155da0b2fe-kube-api-access-6ntsh\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126030 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b696447-c20a-4474-b3ed-2286d1e258fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126106 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-config\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126149 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-cert\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126166 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85040db-48ad-439c-aed0-9152f78666bf-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126222 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgxrv\" (UniqueName: \"kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126243 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126263 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5tq\" (UniqueName: \"kubernetes.io/projected/163bcc7a-64d7-4275-a200-f41cc028e6fd-kube-api-access-lt5tq\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126299 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrc5\" (UniqueName: \"kubernetes.io/projected/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-kube-api-access-hsrc5\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126315 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-cabundle\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126364 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vhk\" (UniqueName: \"kubernetes.io/projected/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-kube-api-access-s9vhk\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126390 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-mountpoint-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126420 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8361af84-73d3-4429-8158-929d70adc1c4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126435 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126490 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26025115-e843-444a-9ab0-78f0a8e2f0cb-metrics-tls\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126529 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ksr\" (UniqueName: \"kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126544 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-plugins-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126558 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-csi-data-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126576 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-srv-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126613 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-socket-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126655 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-default-certificate\") pod \"router-default-5444994796-tjxcx\" (UID: 
\"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126692 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cz9n\" (UniqueName: \"kubernetes.io/projected/da8e51d5-7a7d-436d-ab17-ac15406f4e03-kube-api-access-9cz9n\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126709 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-config\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126758 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7431686b-4741-4c3d-852c-45e2cb99de38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126933 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da8e51d5-7a7d-436d-ab17-ac15406f4e03-service-ca-bundle\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.126989 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b696447-c20a-4474-b3ed-2286d1e258fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.127025 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361af84-73d3-4429-8158-929d70adc1c4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.127113 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-certs\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.127199 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.128953 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/163bcc7a-64d7-4275-a200-f41cc028e6fd-tmpfs\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.129083 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.129116 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-proxy-tls\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.129140 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-registration-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.129458 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.142309 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.143166 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.143565 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.143904 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.144267 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.151711 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.651680606 +0000 UTC m=+146.176712029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.153005 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.156930 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.163446 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.179957 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.205313 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm9zj\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.210292 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230554 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.230652 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.730636437 +0000 UTC m=+146.255667850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230755 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrc5\" (UniqueName: \"kubernetes.io/projected/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-kube-api-access-hsrc5\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230776 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-cabundle\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230799 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vhk\" (UniqueName: \"kubernetes.io/projected/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-kube-api-access-s9vhk\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230820 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-mountpoint-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230847 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8361af84-73d3-4429-8158-929d70adc1c4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230861 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230886 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26025115-e843-444a-9ab0-78f0a8e2f0cb-metrics-tls\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230923 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ksr\" (UniqueName: 
\"kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230938 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-plugins-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230952 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-csi-data-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230968 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-srv-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.230983 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-socket-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231008 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-default-certificate\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231025 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cz9n\" (UniqueName: \"kubernetes.io/projected/da8e51d5-7a7d-436d-ab17-ac15406f4e03-kube-api-access-9cz9n\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231053 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-config\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231069 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7431686b-4741-4c3d-852c-45e2cb99de38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc 
kubenswrapper[4961]: I0201 19:36:31.231084 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b696447-c20a-4474-b3ed-2286d1e258fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231100 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da8e51d5-7a7d-436d-ab17-ac15406f4e03-service-ca-bundle\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231114 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-certs\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231130 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361af84-73d3-4429-8158-929d70adc1c4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231147 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231162 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/163bcc7a-64d7-4275-a200-f41cc028e6fd-tmpfs\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231181 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-proxy-tls\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231196 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-registration-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231214 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/26025115-e843-444a-9ab0-78f0a8e2f0cb-config-volume\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231240 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-stats-auth\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231256 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlnwr\" (UniqueName: \"kubernetes.io/projected/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-kube-api-access-nlnwr\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231270 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8361af84-73d3-4429-8158-929d70adc1c4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231287 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231300 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtwbv\" (UniqueName: \"kubernetes.io/projected/26025115-e843-444a-9ab0-78f0a8e2f0cb-kube-api-access-qtwbv\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231315 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwm7s\" (UniqueName: \"kubernetes.io/projected/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-kube-api-access-vwm7s\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231332 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-srv-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231362 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxgv4\" (UniqueName: \"kubernetes.io/projected/8b696447-c20a-4474-b3ed-2286d1e258fb-kube-api-access-mxgv4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231383 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231403 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231423 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzgg\" (UniqueName: \"kubernetes.io/projected/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-kube-api-access-wvzgg\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231439 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjls\" (UniqueName: \"kubernetes.io/projected/71b2ce56-de70-4686-b2d6-9c0813626125-kube-api-access-jbjls\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231459 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231475 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231494 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk475\" (UniqueName: \"kubernetes.io/projected/d85040db-48ad-439c-aed0-9152f78666bf-kube-api-access-dk475\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231511 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxtq8\" (UniqueName: \"kubernetes.io/projected/9ab40180-9214-4fb8-a267-1ff710ad04e4-kube-api-access-zxtq8\") pod \"csi-hostpathplugin-48wsp\" (UID: 
\"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231533 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231547 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d85040db-48ad-439c-aed0-9152f78666bf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231567 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wfx\" (UniqueName: \"kubernetes.io/projected/7431686b-4741-4c3d-852c-45e2cb99de38-kube-api-access-m5wfx\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231585 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231602 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kctkk\" (UniqueName: \"kubernetes.io/projected/1a177e48-a514-480b-8315-2ad8d69c0cd4-kube-api-access-kctkk\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231617 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-metrics-certs\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231631 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkttv\" (UniqueName: \"kubernetes.io/projected/30c54837-57a2-47d2-b2f0-e4c17a056b9e-kube-api-access-kkttv\") pod \"migrator-59844c95c7-dzhm7\" (UID: \"30c54837-57a2-47d2-b2f0-e4c17a056b9e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231650 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/547d4fdd-999f-4578-9efc-38155da0b2fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231665 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-images\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231680 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-key\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231696 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcj9d\" (UniqueName: \"kubernetes.io/projected/ec361338-d606-471f-acfe-9f94619f21f3-kube-api-access-xcj9d\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231718 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231734 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-node-bootstrap-token\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231749 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1a177e48-a514-480b-8315-2ad8d69c0cd4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231774 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-profile-collector-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231792 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-webhook-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231806 4961 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cjcf9\" (UniqueName: \"kubernetes.io/projected/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-kube-api-access-cjcf9\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231821 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-serving-cert\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231836 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntsh\" (UniqueName: \"kubernetes.io/projected/547d4fdd-999f-4578-9efc-38155da0b2fe-kube-api-access-6ntsh\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231855 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-proxy-tls\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231870 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b696447-c20a-4474-b3ed-2286d1e258fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231886 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-config\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231901 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-cert\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231916 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85040db-48ad-439c-aed0-9152f78666bf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231932 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgxrv\" (UniqueName: 
\"kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231950 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.231972 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5tq\" (UniqueName: \"kubernetes.io/projected/163bcc7a-64d7-4275-a200-f41cc028e6fd-kube-api-access-lt5tq\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.232501 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26025115-e843-444a-9ab0-78f0a8e2f0cb-config-volume\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.233330 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-config\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.237178 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7431686b-4741-4c3d-852c-45e2cb99de38-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.239975 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/163bcc7a-64d7-4275-a200-f41cc028e6fd-tmpfs\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.240622 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-default-certificate\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.241302 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-mountpoint-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 
19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.242160 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-cabundle\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.242435 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-plugins-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.242509 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-csi-data-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.242806 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-registration-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.245624 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8361af84-73d3-4429-8158-929d70adc1c4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.245769 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.247419 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-images\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.249142 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b696447-c20a-4474-b3ed-2286d1e258fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.250386 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da8e51d5-7a7d-436d-ab17-ac15406f4e03-service-ca-bundle\") pod \"router-default-5444994796-tjxcx\" (UID: 
\"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.252816 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.752790857 +0000 UTC m=+146.277822490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.256324 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9ab40180-9214-4fb8-a267-1ff710ad04e4-socket-dir\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.257357 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-stats-auth\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.257937 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-profile-collector-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.259140 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-srv-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.260058 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.261369 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-auth-proxy-config\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.264280 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.265798 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.265969 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-proxy-tls\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.266088 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-apiservice-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.266181 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8361af84-73d3-4429-8158-929d70adc1c4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.266582 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.267327 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.268445 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-config\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.269108 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d85040db-48ad-439c-aed0-9152f78666bf-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.271286 4961 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1a177e48-a514-480b-8315-2ad8d69c0cd4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.271615 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/547d4fdd-999f-4578-9efc-38155da0b2fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.272999 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26025115-e843-444a-9ab0-78f0a8e2f0cb-metrics-tls\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.273108 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-cert\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.273436 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/71b2ce56-de70-4686-b2d6-9c0813626125-srv-cert\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.273787 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.274251 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-node-bootstrap-token\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.274409 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-serving-cert\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.278189 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-signing-key\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: 
I0201 19:36:31.278914 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d85040db-48ad-439c-aed0-9152f78666bf-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.280918 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5tq\" (UniqueName: \"kubernetes.io/projected/163bcc7a-64d7-4275-a200-f41cc028e6fd-kube-api-access-lt5tq\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.282874 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-proxy-tls\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.282887 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.283415 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/163bcc7a-64d7-4275-a200-f41cc028e6fd-webhook-cert\") pod \"packageserver-d55dfcdfc-pkfkf\" (UID: \"163bcc7a-64d7-4275-a200-f41cc028e6fd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.284789 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b696447-c20a-4474-b3ed-2286d1e258fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.285138 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da8e51d5-7a7d-436d-ab17-ac15406f4e03-metrics-certs\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.298190 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-68b9n"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.304184 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ec361338-d606-471f-acfe-9f94619f21f3-certs\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc 
kubenswrapper[4961]: I0201 19:36:31.314067 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrc5\" (UniqueName: \"kubernetes.io/projected/74904be9-13e0-4bd0-9bf1-b0490e87aaaa-kube-api-access-hsrc5\") pod \"ingress-canary-lc945\" (UID: \"74904be9-13e0-4bd0-9bf1-b0490e87aaaa\") " pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.325013 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcj9d\" (UniqueName: \"kubernetes.io/projected/ec361338-d606-471f-acfe-9f94619f21f3-kube-api-access-xcj9d\") pod \"machine-config-server-lczb5\" (UID: \"ec361338-d606-471f-acfe-9f94619f21f3\") " pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.326781 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cz9n\" (UniqueName: \"kubernetes.io/projected/da8e51d5-7a7d-436d-ab17-ac15406f4e03-kube-api-access-9cz9n\") pod \"router-default-5444994796-tjxcx\" (UID: \"da8e51d5-7a7d-436d-ab17-ac15406f4e03\") " pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.333267 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.333766 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.833752415 +0000 UTC m=+146.358783828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.342167 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.356755 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkttv\" (UniqueName: \"kubernetes.io/projected/30c54837-57a2-47d2-b2f0-e4c17a056b9e-kube-api-access-kkttv\") pod \"migrator-59844c95c7-dzhm7\" (UID: \"30c54837-57a2-47d2-b2f0-e4c17a056b9e\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.359425 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.367144 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hwdhh"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.377053 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlnwr\" (UniqueName: \"kubernetes.io/projected/a61e77b8-2380-46cc-bca0-4ba53bd9b82d-kube-api-access-nlnwr\") pod \"olm-operator-6b444d44fb-7q6tm\" (UID: \"a61e77b8-2380-46cc-bca0-4ba53bd9b82d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.383770 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7ksr\" (UniqueName: \"kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr\") pod \"collect-profiles-29499570-lnpdj\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.390447 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.397596 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lc945" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.399456 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.400659 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a1f5dd_69f3_4ba2_b48f_14aef559e6b2.slice/crio-43a89195753364d69c5ba91a44f1b3167c1dcc87bd4a97c87b529b1240586ddc WatchSource:0}: Error finding container 43a89195753364d69c5ba91a44f1b3167c1dcc87bd4a97c87b529b1240586ddc: Status 404 returned error can't find the container with id 43a89195753364d69c5ba91a44f1b3167c1dcc87bd4a97c87b529b1240586ddc Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.407449 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lczb5" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.421495 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vhk\" (UniqueName: \"kubernetes.io/projected/1fa93ea5-1dca-4474-a0a7-be6ae2807e68-kube-api-access-s9vhk\") pod \"machine-config-controller-84d6567774-mbtkv\" (UID: \"1fa93ea5-1dca-4474-a0a7-be6ae2807e68\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.434880 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.435488 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:31.935462373 +0000 UTC m=+146.460493796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.443759 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9cfh9"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.447720 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.450569 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8361af84-73d3-4429-8158-929d70adc1c4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v454t\" (UID: \"8361af84-73d3-4429-8158-929d70adc1c4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.466749 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjcf9\" (UniqueName: \"kubernetes.io/projected/0ac9b2b1-423f-444c-9274-d0fdb22f2d27-kube-api-access-cjcf9\") pod \"service-ca-9c57cc56f-mjk8q\" (UID: \"0ac9b2b1-423f-444c-9274-d0fdb22f2d27\") " pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.477141 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadda0b5e_49cb_4a97_a76c_0f19caddf65a.slice/crio-222f5be60ca0378c8ba260383564f4215134a68a6eb6b83a3d406ac2b7edf1d9 WatchSource:0}: Error finding container 222f5be60ca0378c8ba260383564f4215134a68a6eb6b83a3d406ac2b7edf1d9: Status 404 returned error can't find the container with id 
222f5be60ca0378c8ba260383564f4215134a68a6eb6b83a3d406ac2b7edf1d9 Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.494281 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72e4afaf-4ac5-4c16-899a-c19b7a0550cf-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8n4k2\" (UID: \"72e4afaf-4ac5-4c16-899a-c19b7a0550cf\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.498406 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk"] Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.502752 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613fa395_90dd_4c22_8326_57e4dcb5f79f.slice/crio-865a153eb7e03abe94d92983ca9c7a34434ad2f5264ea1d6ee10a8290be38442 WatchSource:0}: Error finding container 865a153eb7e03abe94d92983ca9c7a34434ad2f5264ea1d6ee10a8290be38442: Status 404 returned error can't find the container with id 865a153eb7e03abe94d92983ca9c7a34434ad2f5264ea1d6ee10a8290be38442 Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.511908 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntsh\" (UniqueName: \"kubernetes.io/projected/547d4fdd-999f-4578-9efc-38155da0b2fe-kube-api-access-6ntsh\") pod \"control-plane-machine-set-operator-78cbb6b69f-xzdqx\" (UID: \"547d4fdd-999f-4578-9efc-38155da0b2fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.532815 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-xppsd"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.533894 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.536433 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.536581 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.036560454 +0000 UTC m=+146.561591887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.536662 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.537007 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.036995846 +0000 UTC m=+146.562027269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.540364 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9w5c5"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.547894 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxgv4\" (UniqueName: \"kubernetes.io/projected/8b696447-c20a-4474-b3ed-2286d1e258fb-kube-api-access-mxgv4\") pod \"kube-storage-version-migrator-operator-b67b599dd-x2kt7\" (UID: \"8b696447-c20a-4474-b3ed-2286d1e258fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.548532 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjls\" (UniqueName: \"kubernetes.io/projected/71b2ce56-de70-4686-b2d6-9c0813626125-kube-api-access-jbjls\") pod \"catalog-operator-68c6474976-wfpd6\" (UID: \"71b2ce56-de70-4686-b2d6-9c0813626125\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.555564 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.559375 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.560390 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.566957 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.568250 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtwbv\" (UniqueName: \"kubernetes.io/projected/26025115-e843-444a-9ab0-78f0a8e2f0cb-kube-api-access-qtwbv\") pod \"dns-default-wgjjb\" (UID: \"26025115-e843-444a-9ab0-78f0a8e2f0cb\") " pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.574544 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.581413 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.585574 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwm7s\" (UniqueName: \"kubernetes.io/projected/6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab-kube-api-access-vwm7s\") pod \"machine-config-operator-74547568cd-nc9sp\" (UID: \"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.610070 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk475\" (UniqueName: \"kubernetes.io/projected/d85040db-48ad-439c-aed0-9152f78666bf-kube-api-access-dk475\") pod \"openshift-controller-manager-operator-756b6f6bc6-5fwwn\" (UID: \"d85040db-48ad-439c-aed0-9152f78666bf\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.613912 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.621509 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.621548 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgxrv\" (UniqueName: \"kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv\") pod \"marketplace-operator-79b997595-rk9ts\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.628595 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5420df36_687d_4d73_9628_3ebe3597e733.slice/crio-f4bfe4577a4beb55866ebfa1662cef7ac4799b1282fb1ebde7a365e21eb224c8 WatchSource:0}: Error finding container f4bfe4577a4beb55866ebfa1662cef7ac4799b1282fb1ebde7a365e21eb224c8: Status 404 returned error can't find the container with id f4bfe4577a4beb55866ebfa1662cef7ac4799b1282fb1ebde7a365e21eb224c8 Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.638433 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.638936 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.138918671 +0000 UTC m=+146.663950094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.648616 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.649063 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kctkk\" (UniqueName: \"kubernetes.io/projected/1a177e48-a514-480b-8315-2ad8d69c0cd4-kube-api-access-kctkk\") pod \"multus-admission-controller-857f4d67dd-gl2kg\" (UID: \"1a177e48-a514-480b-8315-2ad8d69c0cd4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.661769 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.669167 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzgg\" (UniqueName: \"kubernetes.io/projected/12553b4f-8c6c-4df6-ae4e-62af8a7354ce-kube-api-access-wvzgg\") pod \"service-ca-operator-777779d784-zdws9\" (UID: \"12553b4f-8c6c-4df6-ae4e-62af8a7354ce\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.690887 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtq8\" (UniqueName: \"kubernetes.io/projected/9ab40180-9214-4fb8-a267-1ff710ad04e4-kube-api-access-zxtq8\") pod \"csi-hostpathplugin-48wsp\" (UID: \"9ab40180-9214-4fb8-a267-1ff710ad04e4\") " pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.720266 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.729972 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wfx\" (UniqueName: \"kubernetes.io/projected/7431686b-4741-4c3d-852c-45e2cb99de38-kube-api-access-m5wfx\") pod \"package-server-manager-789f6589d5-s6pw4\" (UID: \"7431686b-4741-4c3d-852c-45e2cb99de38\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.730439 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-pn8rk"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.741851 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.743316 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.243086468 +0000 UTC m=+146.768117891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.829940 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj"] Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.838811 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8e51d5_7a7d_436d_ab17_ac15406f4e03.slice/crio-0566bd36baff3ba0504d0e2be5ef08a1ca85c752cc2dde0caed804b9ee2333a6 WatchSource:0}: Error finding container 0566bd36baff3ba0504d0e2be5ef08a1ca85c752cc2dde0caed804b9ee2333a6: Status 404 returned error can't find the container with id 0566bd36baff3ba0504d0e2be5ef08a1ca85c752cc2dde0caed804b9ee2333a6 Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.844483 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.845414 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.846120 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.346101673 +0000 UTC m=+146.871133086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.849240 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.861466 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.867976 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.889014 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:31 crc kubenswrapper[4961]: W0201 19:36:31.905110 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5095491b_50cd_4181_97ac_da46cc4608ca.slice/crio-45ee8a61d465eeb148a19e688f5cf92505c299b6dc94d3359e8e42271673a8b3 WatchSource:0}: Error finding container 45ee8a61d465eeb148a19e688f5cf92505c299b6dc94d3359e8e42271673a8b3: Status 404 returned error can't find the container with id 45ee8a61d465eeb148a19e688f5cf92505c299b6dc94d3359e8e42271673a8b3 Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.926004 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lc945"] Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.938786 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.938904 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.955522 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.956849 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:31 crc kubenswrapper[4961]: E0201 19:36:31.957305 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.457286866 +0000 UTC m=+146.982318289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:31 crc kubenswrapper[4961]: I0201 19:36:31.981269 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.000030 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dw6q4"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.029317 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.058985 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.062404 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.562365679 +0000 UTC m=+147.087397102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.062500 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.062978 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.562952725 +0000 UTC m=+147.087984368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.082248 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.119596 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.125295 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lczb5" event={"ID":"ec361338-d606-471f-acfe-9f94619f21f3","Type":"ContainerStarted","Data":"01303f83142491a5c0ce7de2c781ce93dd453fe5ce14be73b7548737a42ed830"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.135947 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" event={"ID":"e705ec1a-dabb-4f4f-942a-66626c9c65d4","Type":"ContainerStarted","Data":"aeacb5e8a90e94a752628130e5ef172cb4324546ec11b8a8e9171bad5f8acc26"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.135991 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" event={"ID":"e705ec1a-dabb-4f4f-942a-66626c9c65d4","Type":"ContainerStarted","Data":"4686189ad58deec8bc5106452fccd6dbf3700b495c6318b0cdeff9efcae3be73"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.137248 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" event={"ID":"11a86cbd-91cd-4966-a44d-423fbdb252d7","Type":"ContainerStarted","Data":"4a19bdfcb74e49bf31203581caedc9b8440a94b09bd2eaa10a59af8ea6ebf6fe"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.138667 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" event={"ID":"89fc5280-7863-4e88-bbe6-b6f82128d057","Type":"ContainerStarted","Data":"d263d7beafc82c3cde5b54fff75c52f22d2c2cd4b59545963710497bc2646ae6"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.139964 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" event={"ID":"8ed21892-8bb6-46f1-a593-16b869fb63d3","Type":"ContainerStarted","Data":"1a02984071bbcb7e3f33aee3e087bba906702c562563a4fd2179cda72c78aab7"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.142104 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" event={"ID":"be03c00f-3bf0-4698-afe4-201249fe36a7","Type":"ContainerStarted","Data":"e1ef06f5d09589a522250a92d1173810a7aaaf8b3ddb551c23709c5db3e9c150"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.143866 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" 
event={"ID":"b163a77e-e04e-46ab-ac6c-bfadbe238992","Type":"ContainerStarted","Data":"bdba94c606d317afcac3ac1ca4a772f31be5f55de810f1b542d77260364b82bd"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.147171 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" event={"ID":"1bf30422-ee8e-4982-948c-99fd27354727","Type":"ContainerStarted","Data":"41d93c4d86b559c25641ec839675b7588e9b5b0b0b3e346404f1c9e26faea63d"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.147200 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" event={"ID":"1bf30422-ee8e-4982-948c-99fd27354727","Type":"ContainerStarted","Data":"a169a8dfcd918f4e9d2bf46a6f73662a2242335ba70f867e33f4e44a3af4fe71"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.149146 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" event={"ID":"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2","Type":"ContainerStarted","Data":"f8190ca63d9d2baf892eb6bf140c1c7b4d96cd026f317c37502c75b610527ac5"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.149170 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" event={"ID":"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2","Type":"ContainerStarted","Data":"43a89195753364d69c5ba91a44f1b3167c1dcc87bd4a97c87b529b1240586ddc"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.150009 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" event={"ID":"613fa395-90dd-4c22-8326-57e4dcb5f79f","Type":"ContainerStarted","Data":"865a153eb7e03abe94d92983ca9c7a34434ad2f5264ea1d6ee10a8290be38442"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.152566 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" event={"ID":"d9537efb-5e4e-414f-a0f2-7e566511b95c","Type":"ContainerStarted","Data":"3f375526bd1cbd0855403f66360dccad428fb2fcd64d10cff2f923b40c2551c6"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.152642 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" event={"ID":"d9537efb-5e4e-414f-a0f2-7e566511b95c","Type":"ContainerStarted","Data":"81ee2afe53d8db3487be2a024f2913094ea32533986710fba567fd45df261f78"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.156939 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tjxcx" event={"ID":"da8e51d5-7a7d-436d-ab17-ac15406f4e03","Type":"ContainerStarted","Data":"0566bd36baff3ba0504d0e2be5ef08a1ca85c752cc2dde0caed804b9ee2333a6"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.157845 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" event={"ID":"20a2004c-d84f-4be5-b205-70e27386ad37","Type":"ContainerStarted","Data":"5956def3689d2176baf87cb40e98b6ccfa62593259a3c0fd9e81aa596c738b23"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.160962 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-v684w" event={"ID":"a6e8e0a4-fb78-4de0-8a48-3283e2911ac0","Type":"ContainerStarted","Data":"3cf81a5e43d2dfbe702d9c259a94338a09c52e5895594889e66071e3d1f8da43"} Feb 01 
19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.161671 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.162967 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.163116 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" event={"ID":"badc8c22-a530-44b6-9dfc-9fba07b633e8","Type":"ContainerStarted","Data":"92ad8ab44e8b101059a7355490f004e14674cfb59ea24dd8f748fc311b7d90c8"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.163185 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" event={"ID":"badc8c22-a530-44b6-9dfc-9fba07b633e8","Type":"ContainerStarted","Data":"e71e7088472da963da9fadd2fc808cecb5f87648090d9db10e03b956e328d3ae"} Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.163158 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.663135791 +0000 UTC m=+147.188167214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.163335 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.163679 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.663664996 +0000 UTC m=+147.188696419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.164530 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" event={"ID":"fb1052b2-d1b5-4437-b80c-0b9112d041c8","Type":"ContainerStarted","Data":"d6494111a71f6fb6b732ddf8c55c7e2fd3a6bab4807ed91312ffba5b12709763"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.168115 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-v684w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.168171 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v684w" podUID="a6e8e0a4-fb78-4de0-8a48-3283e2911ac0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.170601 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" event={"ID":"adda0b5e-49cb-4a97-a76c-0f19caddf65a","Type":"ContainerStarted","Data":"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.170659 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" event={"ID":"adda0b5e-49cb-4a97-a76c-0f19caddf65a","Type":"ContainerStarted","Data":"222f5be60ca0378c8ba260383564f4215134a68a6eb6b83a3d406ac2b7edf1d9"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.172309 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.175021 4961 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-9qwhp container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.175068 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.185645 4961 generic.go:334] "Generic (PLEG): container finished" podID="ad1ae82a-4644-4f94-bc94-14605d3acfaa" containerID="c745eec8fb724d79d7d045bb06f21703c19fd2ffddb85573367a5d2ea06af51e" exitCode=0 Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.185725 4961 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" event={"ID":"ad1ae82a-4644-4f94-bc94-14605d3acfaa","Type":"ContainerDied","Data":"c745eec8fb724d79d7d045bb06f21703c19fd2ffddb85573367a5d2ea06af51e"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.192729 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" event={"ID":"5095491b-50cd-4181-97ac-da46cc4608ca","Type":"ContainerStarted","Data":"45ee8a61d465eeb148a19e688f5cf92505c299b6dc94d3359e8e42271673a8b3"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.212019 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" event={"ID":"5420df36-687d-4d73-9628-3ebe3597e733","Type":"ContainerStarted","Data":"f4bfe4577a4beb55866ebfa1662cef7ac4799b1282fb1ebde7a365e21eb224c8"} Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.266284 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.267030 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.7669996 +0000 UTC m=+147.292031033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.268140 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.269663 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.769649264 +0000 UTC m=+147.294680687 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.369657 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.369774 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.869751957 +0000 UTC m=+147.394783380 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.370404 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.373384 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.873366848 +0000 UTC m=+147.398398271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.374859 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t"] Feb 01 19:36:32 crc kubenswrapper[4961]: W0201 19:36:32.426472 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fa93ea5_1dca_4474_a0a7_be6ae2807e68.slice/crio-81caf55ab2252446b3a4e6496b56be669d0eb294d47d89fc20cbbb494cd396f4 WatchSource:0}: Error finding container 81caf55ab2252446b3a4e6496b56be669d0eb294d47d89fc20cbbb494cd396f4: Status 404 returned error can't find the container with id 81caf55ab2252446b3a4e6496b56be669d0eb294d47d89fc20cbbb494cd396f4 Feb 01 19:36:32 crc kubenswrapper[4961]: W0201 19:36:32.458094 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8361af84_73d3_4429_8158_929d70adc1c4.slice/crio-95c422b373f728e448e57ebedcbf729742646899e75b1e07a5c9a95501addaf8 WatchSource:0}: Error finding container 95c422b373f728e448e57ebedcbf729742646899e75b1e07a5c9a95501addaf8: Status 404 returned error can't find the container with id 95c422b373f728e448e57ebedcbf729742646899e75b1e07a5c9a95501addaf8 Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.471436 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.471672 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.97162519 +0000 UTC m=+147.496656613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.472105 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.472643 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:32.972635388 +0000 UTC m=+147.497666811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.543367 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.562969 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.573578 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.573738 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.073707889 +0000 UTC m=+147.598739312 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.573920 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.576686 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.076671572 +0000 UTC m=+147.601702995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.600197 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.659518 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.667980 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.674762 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.675272 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.175247582 +0000 UTC m=+147.700279005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.679318 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-mjk8q"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.704517 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.779904 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.782886 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.280460178 +0000 UTC m=+147.805491601 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.829970 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wgjjb"] Feb 01 19:36:32 crc kubenswrapper[4961]: W0201 19:36:32.856951 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85040db_48ad_439c_aed0_9152f78666bf.slice/crio-179d81882c49ce9a1e4c01c78a131fc02ef547b9b86886afc5b588620c02cb7c WatchSource:0}: Error finding container 179d81882c49ce9a1e4c01c78a131fc02ef547b9b86886afc5b588620c02cb7c: Status 404 returned error can't find the container with id 179d81882c49ce9a1e4c01c78a131fc02ef547b9b86886afc5b588620c02cb7c Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.881677 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.882347 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 19:36:33.382317291 +0000 UTC m=+147.907348714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.882482 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 19:36:32.882864 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.382842366 +0000 UTC m=+147.907873789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:32 crc kubenswrapper[4961]: W0201 19:36:32.885087 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b696447_c20a_4474_b3ed_2286d1e258fb.slice/crio-935eb796a75d5e6457ff4b99dfa76de6a5bd85d3f8aa73f70dd4b31558d84629 WatchSource:0}: Error finding container 935eb796a75d5e6457ff4b99dfa76de6a5bd85d3f8aa73f70dd4b31558d84629: Status 404 returned error can't find the container with id 935eb796a75d5e6457ff4b99dfa76de6a5bd85d3f8aa73f70dd4b31558d84629 Feb 01 19:36:32 crc kubenswrapper[4961]: W0201 19:36:32.893536 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61e77b8_2380_46cc_bca0_4ba53bd9b82d.slice/crio-02585f99921ea1b80489a3c9662215c4beb122efbcb8d95a9fdd1549846077c2 WatchSource:0}: Error finding container 02585f99921ea1b80489a3c9662215c4beb122efbcb8d95a9fdd1549846077c2: Status 404 returned error can't find the container with id 02585f99921ea1b80489a3c9662215c4beb122efbcb8d95a9fdd1549846077c2 Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.932697 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp"] Feb 01 19:36:32 crc kubenswrapper[4961]: I0201 19:36:32.983630 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:32 crc kubenswrapper[4961]: E0201 
19:36:32.984092 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.484074541 +0000 UTC m=+148.009105964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: W0201 19:36:33.010440 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6342cfd5_2f00_4e99_ae24_bc7d8bbee4ab.slice/crio-fee86ad86640fc82581a11ee41aa37e4439b8ccc2347ce7142d387cd5176b265 WatchSource:0}: Error finding container fee86ad86640fc82581a11ee41aa37e4439b8ccc2347ce7142d387cd5176b265: Status 404 returned error can't find the container with id fee86ad86640fc82581a11ee41aa37e4439b8ccc2347ce7142d387cd5176b265 Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.084815 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.085342 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.585318036 +0000 UTC m=+148.110349459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.107208 4961 csr.go:261] certificate signing request csr-s85ws is approved, waiting to be issued Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.109999 4961 csr.go:257] certificate signing request csr-s85ws is issued Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.175514 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fxmpb" podStartSLOduration=126.175492831 podStartE2EDuration="2m6.175492831s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.174579716 +0000 UTC m=+147.699611159" watchObservedRunningTime="2026-02-01 19:36:33.175492831 +0000 UTC m=+147.700524254" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.185824 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.189365 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.689323438 +0000 UTC m=+148.214354861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.244854 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" event={"ID":"b163a77e-e04e-46ab-ac6c-bfadbe238992","Type":"ContainerStarted","Data":"150e629c6cd52f53bb849191a81a8cc04a18c1f58ccef0cba102f8db47b0c318"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.245324 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.246485 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-898rx" podStartSLOduration=126.246464789 podStartE2EDuration="2m6.246464789s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.225594594 +0000 UTC m=+147.750626017" watchObservedRunningTime="2026-02-01 19:36:33.246464789 +0000 UTC m=+147.771496212" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.251810 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dw6q4" event={"ID":"01c9f61a-85b0-499a-9885-1a02903ef457","Type":"ContainerStarted","Data":"83fcb34cfff9743432577ef319f7b7bdbb8c20e0885299eb487516745f08db23"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.257990 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-v684w" podStartSLOduration=126.257970861 podStartE2EDuration="2m6.257970861s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.255398869 +0000 UTC m=+147.780430292" watchObservedRunningTime="2026-02-01 19:36:33.257970861 +0000 UTC m=+147.783002284" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.289535 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" event={"ID":"20a2004c-d84f-4be5-b205-70e27386ad37","Type":"ContainerStarted","Data":"5e9de551bddfbf9190d413e647e605f55a919f1904ecf91884cafc1a6b725d07"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.290462 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.294789 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.295071 4961 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.79505969 +0000 UTC m=+148.320091113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.297049 4961 patch_prober.go:28] interesting pod/console-operator-58897d9998-pn8rk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.297081 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" podUID="b163a77e-e04e-46ab-ac6c-bfadbe238992" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.297825 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" event={"ID":"a61e77b8-2380-46cc-bca0-4ba53bd9b82d","Type":"ContainerStarted","Data":"02585f99921ea1b80489a3c9662215c4beb122efbcb8d95a9fdd1549846077c2"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.301089 4961 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-597bm container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.301174 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.301635 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" event={"ID":"11a86cbd-91cd-4966-a44d-423fbdb252d7","Type":"ContainerStarted","Data":"fb2a485d20f2d92f05ef9ffc7d78957fc4b0b82931476a5ac7c7892bca4657fe"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.311649 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" podStartSLOduration=126.311627874 podStartE2EDuration="2m6.311627874s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.308392983 +0000 UTC m=+147.833424406" watchObservedRunningTime="2026-02-01 19:36:33.311627874 +0000 UTC m=+147.836659297" Feb 
01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.378178 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qjgjt" podStartSLOduration=126.378145777 podStartE2EDuration="2m6.378145777s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.368615109 +0000 UTC m=+147.893646532" watchObservedRunningTime="2026-02-01 19:36:33.378145777 +0000 UTC m=+147.903177200" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.384785 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" event={"ID":"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab","Type":"ContainerStarted","Data":"fee86ad86640fc82581a11ee41aa37e4439b8ccc2347ce7142d387cd5176b265"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.395501 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.395759 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.895710428 +0000 UTC m=+148.420741991 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.395918 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.397007 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.896994754 +0000 UTC m=+148.422026397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.403072 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" event={"ID":"613fa395-90dd-4c22-8326-57e4dcb5f79f","Type":"ContainerStarted","Data":"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.404300 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.412969 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7hprk" podStartSLOduration=126.412951051 podStartE2EDuration="2m6.412951051s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.40471953 +0000 UTC m=+147.929750953" watchObservedRunningTime="2026-02-01 19:36:33.412951051 +0000 UTC m=+147.937982464" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.415194 4961 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-nbv69 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.415241 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.422224 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" event={"ID":"5420df36-687d-4d73-9628-3ebe3597e733","Type":"ContainerStarted","Data":"0f341c9a2f4c9f1ea473fc11e2da5189f1a298875541caadd36ebcdfc1c1e0b4"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.437221 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" event={"ID":"d85040db-48ad-439c-aed0-9152f78666bf","Type":"ContainerStarted","Data":"179d81882c49ce9a1e4c01c78a131fc02ef547b9b86886afc5b588620c02cb7c"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.455351 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" podStartSLOduration=126.455320367 podStartE2EDuration="2m6.455320367s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.454431123 
+0000 UTC m=+147.979462546" watchObservedRunningTime="2026-02-01 19:36:33.455320367 +0000 UTC m=+147.980351790" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.456287 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" event={"ID":"72e4afaf-4ac5-4c16-899a-c19b7a0550cf","Type":"ContainerStarted","Data":"6206bc584f6431a2ecbcf9142af9c9c3a9d76f3119d99e7acba6560a09872948"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.497895 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.498319 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:33.998285691 +0000 UTC m=+148.523317114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.499347 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.500582 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.000556294 +0000 UTC m=+148.525587717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.515858 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" podStartSLOduration=126.515837332 podStartE2EDuration="2m6.515837332s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.481734958 +0000 UTC m=+148.006766381" watchObservedRunningTime="2026-02-01 19:36:33.515837332 +0000 UTC m=+148.040868755" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.517710 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" podStartSLOduration=126.517701965 podStartE2EDuration="2m6.517701965s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.515949116 +0000 UTC m=+148.040980539" watchObservedRunningTime="2026-02-01 19:36:33.517701965 +0000 UTC m=+148.042733388" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.531929 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-tjxcx" event={"ID":"da8e51d5-7a7d-436d-ab17-ac15406f4e03","Type":"ContainerStarted","Data":"17c32cdf1d2ae097ad7261c9cb8485b8e4f45b69d96865c662941c522a7d4604"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.545152 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-48wsp"] Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.566851 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.567025 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.567100 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.588600 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lczb5" event={"ID":"ec361338-d606-471f-acfe-9f94619f21f3","Type":"ContainerStarted","Data":"5b83627acca61e58bc39f7dc1f75d02a43cfdcbe22f1c0f63568e4e2b46f8270"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.591688 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-5444994796-tjxcx" podStartSLOduration=126.591672926 podStartE2EDuration="2m6.591672926s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.590656258 +0000 UTC m=+148.115687681" watchObservedRunningTime="2026-02-01 19:36:33.591672926 +0000 UTC m=+148.116704349" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.592107 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9w5c5" podStartSLOduration=126.592101738 podStartE2EDuration="2m6.592101738s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.550522703 +0000 UTC m=+148.075554126" watchObservedRunningTime="2026-02-01 19:36:33.592101738 +0000 UTC m=+148.117133161" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.600891 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.604437 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.104412712 +0000 UTC m=+148.629444135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.605072 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" event={"ID":"30c54837-57a2-47d2-b2f0-e4c17a056b9e","Type":"ContainerStarted","Data":"6a52020692b11b66a5afaffb5fc7faa337b6ef9b5c98dbc8d6d3b7aa9dbbdcfb"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.611514 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4"] Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.615975 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lczb5" podStartSLOduration=5.615950326 podStartE2EDuration="5.615950326s" podCreationTimestamp="2026-02-01 19:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.61255683 +0000 UTC m=+148.137588253" watchObservedRunningTime="2026-02-01 19:36:33.615950326 +0000 UTC m=+148.140981749" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.612866 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" event={"ID":"4b5cc4ba-045f-4c69-88c9-763af824474d","Type":"ContainerStarted","Data":"bb921046a984e485a40d53bb2455155292af90af2451e9a949796b9a0849c768"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.624973 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" event={"ID":"fb1052b2-d1b5-4437-b80c-0b9112d041c8","Type":"ContainerStarted","Data":"710e28e03a420feeac5a46a91547063692d6bf07950a637ce898ab5fda42a543"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.648128 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" event={"ID":"71b2ce56-de70-4686-b2d6-9c0813626125","Type":"ContainerStarted","Data":"5bcd418b9d1d1ffafac0209ba1bccb383c4409ac0e08cefdd6a668af5956febb"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.655945 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9468p" podStartSLOduration=126.655933336 podStartE2EDuration="2m6.655933336s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.654517276 +0000 UTC m=+148.179548699" watchObservedRunningTime="2026-02-01 19:36:33.655933336 +0000 UTC m=+148.180964759" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.668265 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" 
event={"ID":"547d4fdd-999f-4578-9efc-38155da0b2fe","Type":"ContainerStarted","Data":"878ad436dd5f71f30565539c1f0348264694083f149f35ce39f05b20197db607"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.678876 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zdws9"] Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.692356 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" event={"ID":"86a1f5dd-69f3-4ba2-b48f-14aef559e6b2","Type":"ContainerStarted","Data":"97feb4b7a2baeddcb7156700320ea9f632004617c6502bbbdf8c408164358d8e"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.706565 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" event={"ID":"8361af84-73d3-4429-8158-929d70adc1c4","Type":"ContainerStarted","Data":"95c422b373f728e448e57ebedcbf729742646899e75b1e07a5c9a95501addaf8"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.707176 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.708817 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.208804606 +0000 UTC m=+148.733836029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.726960 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-68b9n" podStartSLOduration=126.726942094 podStartE2EDuration="2m6.726942094s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.725224776 +0000 UTC m=+148.250256199" watchObservedRunningTime="2026-02-01 19:36:33.726942094 +0000 UTC m=+148.251973517" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.730184 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" event={"ID":"8b696447-c20a-4474-b3ed-2286d1e258fb","Type":"ContainerStarted","Data":"935eb796a75d5e6457ff4b99dfa76de6a5bd85d3f8aa73f70dd4b31558d84629"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.747302 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" event={"ID":"89fc5280-7863-4e88-bbe6-b6f82128d057","Type":"ContainerStarted","Data":"ee0ad704ac42b6f57912fdf00f8af8df5c2aea47b1488c1663bb8f780b1944b2"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.759265 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"] Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.760588 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgjjb" event={"ID":"26025115-e843-444a-9ab0-78f0a8e2f0cb","Type":"ContainerStarted","Data":"ba7aa5f53720ca6deaa6530ef9b8e18fa2e9346ea171121db5d161311556b294"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.765598 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" event={"ID":"1fa93ea5-1dca-4474-a0a7-be6ae2807e68","Type":"ContainerStarted","Data":"81caf55ab2252446b3a4e6496b56be669d0eb294d47d89fc20cbbb494cd396f4"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.783951 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gl2kg"] Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.796725 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9cfh9" podStartSLOduration=126.796705708 podStartE2EDuration="2m6.796705708s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.781312217 +0000 UTC m=+148.306343640" watchObservedRunningTime="2026-02-01 19:36:33.796705708 +0000 UTC m=+148.321737121" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.813752 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.814022 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.313985211 +0000 UTC m=+148.839016634 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.814832 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.815237 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.315227497 +0000 UTC m=+148.840258920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.839833 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lc945" event={"ID":"74904be9-13e0-4bd0-9bf1-b0490e87aaaa","Type":"ContainerStarted","Data":"abf869115d61219f062b66c305403b793398658b6a8175e96b3b282162cfbb06"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.868554 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" event={"ID":"0ac9b2b1-423f-444c-9274-d0fdb22f2d27","Type":"ContainerStarted","Data":"d3324bcfee2635e3d572efce1dbc2527e7ef0452a40c1ffe648d1390c47e068f"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.877172 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lc945" podStartSLOduration=5.877140621 podStartE2EDuration="5.877140621s" podCreationTimestamp="2026-02-01 19:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.872507371 +0000 UTC m=+148.397538784" watchObservedRunningTime="2026-02-01 19:36:33.877140621 +0000 UTC m=+148.402172044" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.886556 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-v684w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.886879 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v684w" podUID="a6e8e0a4-fb78-4de0-8a48-3283e2911ac0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.887201 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" event={"ID":"163bcc7a-64d7-4275-a200-f41cc028e6fd","Type":"ContainerStarted","Data":"7c40097002bab383735e401565fd65286e2fb3ac4a4ab529beee6f61474b0101"} Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.905570 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:36:33 crc kubenswrapper[4961]: I0201 19:36:33.926940 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" podStartSLOduration=126.926919015 podStartE2EDuration="2m6.926919015s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:33.919929029 +0000 UTC m=+148.444960452" watchObservedRunningTime="2026-02-01 19:36:33.926919015 +0000 UTC m=+148.451950438" Feb 01 19:36:33 
crc kubenswrapper[4961]: I0201 19:36:33.929438 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:33 crc kubenswrapper[4961]: E0201 19:36:33.931613 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.431595065 +0000 UTC m=+148.956626488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.030672 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.031655 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.531639407 +0000 UTC m=+149.056670840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.111306 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-01 19:31:33 +0000 UTC, rotation deadline is 2026-12-16 07:28:32.014238576 +0000 UTC Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.111734 4961 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7619h51m57.902507086s for next certificate rotation Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.133616 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.134020 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.634005854 +0000 UTC m=+149.159037277 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.236815 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.236855 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.236909 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.236931 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.236952 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.243798 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.743780888 +0000 UTC m=+149.268812311 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.251645 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.268119 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.268898 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.269464 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.276514 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.292712 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.337944 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.338178 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.838162241 +0000 UTC m=+149.363193664 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.377915 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.452199 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.453013 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:34.952999376 +0000 UTC m=+149.478030799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.553708 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.555445 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.055414595 +0000 UTC m=+149.580446018 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.555798 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.556066 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.056058423 +0000 UTC m=+149.581090006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.559888 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.582695 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:34 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:34 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:34 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.582733 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.660457 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.661557 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.161540187 +0000 UTC m=+149.686571610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.762410 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.762813 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.262800793 +0000 UTC m=+149.787832216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.863480 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.863822 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.363804181 +0000 UTC m=+149.888835604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.898226 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" event={"ID":"1fa93ea5-1dca-4474-a0a7-be6ae2807e68","Type":"ContainerStarted","Data":"376074e748a4b2a05d90ea0972e61e1ba6a8b74c2c837760da4e4641642fde4c"} Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.898288 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" event={"ID":"1fa93ea5-1dca-4474-a0a7-be6ae2807e68","Type":"ContainerStarted","Data":"fb894671a022c39d2dc08d8a6923cacf34cf2b7215317150034bb1f3082d6cf6"} Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.912214 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" event={"ID":"8361af84-73d3-4429-8158-929d70adc1c4","Type":"ContainerStarted","Data":"84350f4088bcab752c3ad632157cd3ccb434fde6e3d5cfec45ea8f1f5d5850e5"} Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.921449 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mbtkv" podStartSLOduration=127.921415425 podStartE2EDuration="2m7.921415425s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:34.91909503 +0000 UTC m=+149.444126463" watchObservedRunningTime="2026-02-01 19:36:34.921415425 +0000 UTC m=+149.446446848" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.924843 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" event={"ID":"163bcc7a-64d7-4275-a200-f41cc028e6fd","Type":"ContainerStarted","Data":"bcfa5dea00d7020bc2cd7b2f70ebcb4513894ce3636cda22e3a9159f931a196c"} Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.925253 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.937776 4961 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pkfkf container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.937962 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" podUID="163bcc7a-64d7-4275-a200-f41cc028e6fd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.940256 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" event={"ID":"0ac9b2b1-423f-444c-9274-d0fdb22f2d27","Type":"ContainerStarted","Data":"993f36043238886d93d92361e14b60449eefa8d6d9605536706f3551b75b919c"} Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.951450 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v454t" podStartSLOduration=127.951435925 podStartE2EDuration="2m7.951435925s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:34.949478811 +0000 UTC m=+149.474510234" watchObservedRunningTime="2026-02-01 19:36:34.951435925 +0000 UTC m=+149.476467348" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.982481 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:34 crc kubenswrapper[4961]: E0201 19:36:34.984213 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.484201513 +0000 UTC m=+150.009232926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.984998 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-mjk8q" podStartSLOduration=127.984983624 podStartE2EDuration="2m7.984983624s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:34.982058203 +0000 UTC m=+149.507089626" watchObservedRunningTime="2026-02-01 19:36:34.984983624 +0000 UTC m=+149.510015047" Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.990570 4961 generic.go:334] "Generic (PLEG): container finished" podID="8ed21892-8bb6-46f1-a593-16b869fb63d3" containerID="d2a9478280e6b462c684e98594a536efb4a997fe0c6f24d51d87fb93e45a5eaa" exitCode=0 Feb 01 19:36:34 crc kubenswrapper[4961]: I0201 19:36:34.990674 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" event={"ID":"8ed21892-8bb6-46f1-a593-16b869fb63d3","Type":"ContainerDied","Data":"d2a9478280e6b462c684e98594a536efb4a997fe0c6f24d51d87fb93e45a5eaa"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.035751 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" event={"ID":"7431686b-4741-4c3d-852c-45e2cb99de38","Type":"ContainerStarted","Data":"a98b1acc0101d18cbeca4cebc5155c9563af77a5a7328c7bd25e8ebbdf3ba65e"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.035995 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" event={"ID":"7431686b-4741-4c3d-852c-45e2cb99de38","Type":"ContainerStarted","Data":"84e5a96aea4a19951519f2dfaa6abda67867c01d69e01bc84fb0ac4c72e0d196"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.071252 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" event={"ID":"72e4afaf-4ac5-4c16-899a-c19b7a0550cf","Type":"ContainerStarted","Data":"b364594aff186b69bc5cf5cdbe3c284a319389a177712e954321867988471aa1"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.100927 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.102146 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.602130065 +0000 UTC m=+150.127161488 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.118361 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8n4k2" podStartSLOduration=128.118335589 podStartE2EDuration="2m8.118335589s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.11302893 +0000 UTC m=+149.638060353" watchObservedRunningTime="2026-02-01 19:36:35.118335589 +0000 UTC m=+149.643367012" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.134139 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" event={"ID":"9ab40180-9214-4fb8-a267-1ff710ad04e4","Type":"ContainerStarted","Data":"6e1a3657620f090b90689f4ec906413adacaa2fd1b5ffbb15f84c32a56f46943"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.154962 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" event={"ID":"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab","Type":"ContainerStarted","Data":"ba843a9f14d68760ac8af78b67818c0b0f074acad0a5d2ea031bac3dd82b29de"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.155007 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" event={"ID":"6342cfd5-2f00-4e99-ae24-bc7d8bbee4ab","Type":"ContainerStarted","Data":"b746a66e3eef2b3be6be2afeafbd16d0511de4991631eba1ad599463db85df38"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.193226 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" event={"ID":"30c54837-57a2-47d2-b2f0-e4c17a056b9e","Type":"ContainerStarted","Data":"ec851118ca05c51bca6b129633960aed622a529d677c8bd4eba796a874dfa17d"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.193283 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" event={"ID":"30c54837-57a2-47d2-b2f0-e4c17a056b9e","Type":"ContainerStarted","Data":"116194b9b7c00f673e3abd4af6c00a6652b84ce2bf3066f80881380bc20ce0df"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.204242 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.204561 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 19:36:35.704548254 +0000 UTC m=+150.229579677 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.230245 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-nc9sp" podStartSLOduration=128.230226133 podStartE2EDuration="2m8.230226133s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.1965621 +0000 UTC m=+149.721593523" watchObservedRunningTime="2026-02-01 19:36:35.230226133 +0000 UTC m=+149.755257556" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.230733 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-dzhm7" podStartSLOduration=128.230727367 podStartE2EDuration="2m8.230727367s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.229851692 +0000 UTC m=+149.754883125" watchObservedRunningTime="2026-02-01 19:36:35.230727367 +0000 UTC m=+149.755758780" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.272434 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" event={"ID":"4b5cc4ba-045f-4c69-88c9-763af824474d","Type":"ContainerStarted","Data":"b9b3d527e0f9c1a466798353cfa53048f4fae4bd91273bfec11f36d7adf4d2b3"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.272500 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" event={"ID":"4b5cc4ba-045f-4c69-88c9-763af824474d","Type":"ContainerStarted","Data":"f704cd7286284c4b3215438b18ac00fe05f7cf4ee50a9e303ff8c38881def048"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.310226 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.311814 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.811799087 +0000 UTC m=+150.336830510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.314238 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" event={"ID":"badc8c22-a530-44b6-9dfc-9fba07b633e8","Type":"ContainerStarted","Data":"2e217f142c4be484f7c5cecb7070b8d7d2bcffb1b36f7432e19be125333049cd"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.333819 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ptgr7" podStartSLOduration=128.333804933 podStartE2EDuration="2m8.333804933s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.319345599 +0000 UTC m=+149.844377022" watchObservedRunningTime="2026-02-01 19:36:35.333804933 +0000 UTC m=+149.858836356" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.365645 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" event={"ID":"547d4fdd-999f-4578-9efc-38155da0b2fe","Type":"ContainerStarted","Data":"4c34ac2d13f5697ccbe0a9615c372a63e3e189d91dd15d5328d242d6abfc5587"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.391658 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hwdhh" podStartSLOduration=128.391641742 podStartE2EDuration="2m8.391641742s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.355544392 +0000 UTC m=+149.880575815" watchObservedRunningTime="2026-02-01 19:36:35.391641742 +0000 UTC m=+149.916673165" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.414175 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.415672 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:35.915659665 +0000 UTC m=+150.440691088 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.421996 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" event={"ID":"8b696447-c20a-4474-b3ed-2286d1e258fb","Type":"ContainerStarted","Data":"157d06b76aec5247e648f3289e75faee6390e95f4853554a77f80671739b673a"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.438693 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" event={"ID":"ad1ae82a-4644-4f94-bc94-14605d3acfaa","Type":"ContainerStarted","Data":"74c3df156bd8947a84d7775e61a6647414bf7d595ed9709ca9046fa400daab05"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.451162 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-x2kt7" podStartSLOduration=128.45114591 podStartE2EDuration="2m8.45114591s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.448842815 +0000 UTC m=+149.973874238" watchObservedRunningTime="2026-02-01 19:36:35.45114591 +0000 UTC m=+149.976177333" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.451448 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xzdqx" podStartSLOduration=128.451443098 podStartE2EDuration="2m8.451443098s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.390801729 +0000 UTC m=+149.915833152" watchObservedRunningTime="2026-02-01 19:36:35.451443098 +0000 UTC m=+149.976474521" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.473442 4961 generic.go:334] "Generic (PLEG): container finished" podID="5095491b-50cd-4181-97ac-da46cc4608ca" containerID="57c14d0e82d2a1511e6ec4eddc5f384c39934d51a4fedef395fa233374e1830b" exitCode=0 Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.473534 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" event={"ID":"5095491b-50cd-4181-97ac-da46cc4608ca","Type":"ContainerDied","Data":"57c14d0e82d2a1511e6ec4eddc5f384c39934d51a4fedef395fa233374e1830b"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.473584 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" event={"ID":"5095491b-50cd-4181-97ac-da46cc4608ca","Type":"ContainerStarted","Data":"f14f9042c1e879ff12125b3d67f7e2de6100474f201e04ab931ef594813c1ada"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.474407 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.500339 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" event={"ID":"a61e77b8-2380-46cc-bca0-4ba53bd9b82d","Type":"ContainerStarted","Data":"bfb6e05c7c4def417ee1d3dd765f1933f2ea0c967420655613a774cb40db135b"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.501260 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.510141 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" podStartSLOduration=128.510120281 podStartE2EDuration="2m8.510120281s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.501453928 +0000 UTC m=+150.026485351" watchObservedRunningTime="2026-02-01 19:36:35.510120281 +0000 UTC m=+150.035151704" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.513148 4961 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-7q6tm container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.513198 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" podUID="a61e77b8-2380-46cc-bca0-4ba53bd9b82d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.514832 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.515961 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.015947824 +0000 UTC m=+150.540979247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.554275 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lc945" event={"ID":"74904be9-13e0-4bd0-9bf1-b0490e87aaaa","Type":"ContainerStarted","Data":"df2d243282d86d2dac097ae0a06d0d8ac1d0f7a528dea8557463999626d8ff77"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.559279 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" event={"ID":"d85040db-48ad-439c-aed0-9152f78666bf","Type":"ContainerStarted","Data":"ed2fcd48953ddf13f29ef631188596e89e2ff0a7a69d89863b734ac8e0c748eb"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.567491 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" podStartSLOduration=128.567471247 podStartE2EDuration="2m8.567471247s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.545425479 +0000 UTC m=+150.070456902" watchObservedRunningTime="2026-02-01 19:36:35.567471247 +0000 UTC m=+150.092502670" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.569116 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:35 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:35 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:35 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.569214 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.607841 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" podStartSLOduration=128.607824127 podStartE2EDuration="2m8.607824127s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.569343119 +0000 UTC m=+150.094374542" watchObservedRunningTime="2026-02-01 19:36:35.607824127 +0000 UTC m=+150.132855550" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.608796 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5fwwn" podStartSLOduration=128.608791284 podStartE2EDuration="2m8.608791284s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.607445906 +0000 UTC m=+150.132477329" watchObservedRunningTime="2026-02-01 19:36:35.608791284 +0000 UTC m=+150.133822707" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.617889 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.619268 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.119255687 +0000 UTC m=+150.644287110 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.629646 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" event={"ID":"12553b4f-8c6c-4df6-ae4e-62af8a7354ce","Type":"ContainerStarted","Data":"27a48a0427b48431a9b555a499d1b8e9d84f4e8d76844039fbfaa814938da95d"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.629686 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" event={"ID":"12553b4f-8c6c-4df6-ae4e-62af8a7354ce","Type":"ContainerStarted","Data":"29f41af3ed5bbe5822bca395293c50fe1b9ad6a83301218506f9c23a0b81ce1c"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.659774 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.661174 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" event={"ID":"71b2ce56-de70-4686-b2d6-9c0813626125","Type":"ContainerStarted","Data":"a44276729b0a8936b6faf1d3804bd6fd34c5bf001affdd15d4e5d20d96ef9726"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.664150 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zdws9" podStartSLOduration=128.664135074 podStartE2EDuration="2m8.664135074s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.662050565 +0000 UTC m=+150.187081988" watchObservedRunningTime="2026-02-01 19:36:35.664135074 +0000 UTC m=+150.189166497" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.685750 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" event={"ID":"be03c00f-3bf0-4698-afe4-201249fe36a7","Type":"ContainerStarted","Data":"2713931c99e90e8105f56013d5bb9a64813e5dac535850cda522a005272ba78a"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.686424 4961 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wfpd6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.686451 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" podUID="71b2ce56-de70-4686-b2d6-9c0813626125" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.706415 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dw6q4" event={"ID":"01c9f61a-85b0-499a-9885-1a02903ef457","Type":"ContainerStarted","Data":"7b1a03fe174d3006f6198f340789cf100cb3837107563c114424b3bad05bd38a"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.706567 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" podStartSLOduration=128.706553622 podStartE2EDuration="2m8.706553622s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.704879345 +0000 UTC m=+150.229910758" watchObservedRunningTime="2026-02-01 19:36:35.706553622 +0000 UTC m=+150.231585045" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.719445 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.722482 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.723799 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.223785124 +0000 UTC m=+150.748816547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.724490 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.748583 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" podStartSLOduration=128.748554978 podStartE2EDuration="2m8.748554978s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.744757981 +0000 UTC m=+150.269789404" watchObservedRunningTime="2026-02-01 19:36:35.748554978 +0000 UTC m=+150.273586401" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.751356 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" event={"ID":"1a177e48-a514-480b-8315-2ad8d69c0cd4","Type":"ContainerStarted","Data":"b3f4a2701e999e874189b3ba11b1d4310aad4ce1651080482a1dc65a44207a5a"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.751411 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" event={"ID":"1a177e48-a514-480b-8315-2ad8d69c0cd4","Type":"ContainerStarted","Data":"c110a8b2748e404937a48e690f06070443996615d8ccde95298c4d789a238486"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.774447 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dw6q4" podStartSLOduration=128.774428432 podStartE2EDuration="2m8.774428432s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.773462376 +0000 UTC m=+150.298493799" watchObservedRunningTime="2026-02-01 19:36:35.774428432 +0000 UTC m=+150.299459865" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.779593 4961 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-d28vq container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.779646 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" podUID="ad1ae82a-4644-4f94-bc94-14605d3acfaa" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.787733 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" event={"ID":"6198133a-f3a2-4b73-b4f7-103c5da5e684","Type":"ContainerStarted","Data":"42acd8df684667178a16f73d4d7f3611feddbec330a529ff052adcf8fa6e7550"} Feb 01 
19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.787780 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" event={"ID":"6198133a-f3a2-4b73-b4f7-103c5da5e684","Type":"ContainerStarted","Data":"84eec27ac48b95a31532f8d0659fccd7b0dda06951be74164e18a482179b007a"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.788858 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.801670 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rk9ts container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.801717 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.817527 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgjjb" event={"ID":"26025115-e843-444a-9ab0-78f0a8e2f0cb","Type":"ContainerStarted","Data":"3a6cfc840af1a03e444ecad3b9b63b3a01a585be9952f3a83677415e082b4616"} Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.818716 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-v684w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.818768 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v684w" podUID="a6e8e0a4-fb78-4de0-8a48-3283e2911ac0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.822263 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.836550 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.336524781 +0000 UTC m=+150.861556214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.841425 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.844454 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.913502 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" podStartSLOduration=128.913481447 podStartE2EDuration="2m8.913481447s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.876951954 +0000 UTC m=+150.401983377" watchObservedRunningTime="2026-02-01 19:36:35.913481447 +0000 UTC m=+150.438512870" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.914411 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" podStartSLOduration=128.914405062 podStartE2EDuration="2m8.914405062s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:35.908624081 +0000 UTC m=+150.433655504" watchObservedRunningTime="2026-02-01 19:36:35.914405062 +0000 UTC m=+150.439436485" Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.925170 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.925370 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.425343499 +0000 UTC m=+150.950374922 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:35 crc kubenswrapper[4961]: I0201 19:36:35.926659 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:35 crc kubenswrapper[4961]: E0201 19:36:35.970116 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.470099712 +0000 UTC m=+150.995131135 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.018506 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wgjjb" podStartSLOduration=8.018490617 podStartE2EDuration="8.018490617s" podCreationTimestamp="2026-02-01 19:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:36.001491751 +0000 UTC m=+150.526523174" watchObservedRunningTime="2026-02-01 19:36:36.018490617 +0000 UTC m=+150.543522030" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.027552 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.028200 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.528183739 +0000 UTC m=+151.053215162 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.062280 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-pn8rk" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.128970 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.129286 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.62927472 +0000 UTC m=+151.154306143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.230546 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.230710 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.73067945 +0000 UTC m=+151.255710863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.230926 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.231381 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.731364779 +0000 UTC m=+151.256396202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.363277 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.363980 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.863961302 +0000 UTC m=+151.388992725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.465054 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.465324 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:36.965313181 +0000 UTC m=+151.490344604 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.566482 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.567011 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.066987918 +0000 UTC m=+151.592019331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.571410 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:36 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:36 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:36 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.571647 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.668701 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.669048 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.169016284 +0000 UTC m=+151.694047707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.769657 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.769921 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.269879639 +0000 UTC m=+151.794911062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.770117 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.770604 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.270579699 +0000 UTC m=+151.795611122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.823351 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d000a292fb14ce48a3ba6aee3cbc1fa9964ae1849c11b75dcdf5ca042a1ab928"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.823425 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f4ecffaf2796a688ada0fd87e7cb8a7c3ff033bd8ab890b41b23e44640ef8379"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.827395 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" event={"ID":"7431686b-4741-4c3d-852c-45e2cb99de38","Type":"ContainerStarted","Data":"1dcbeda87c130c32c879d46b76f539124013960721dbfa7556482da9d6d2d725"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.827571 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.830731 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wgjjb" event={"ID":"26025115-e843-444a-9ab0-78f0a8e2f0cb","Type":"ContainerStarted","Data":"8ddaadde6b82bb9ba5c602620e6b60d27c504f931eb6db32eaa5ce25ac870404"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.830941 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.832751 4961 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c7cd6e0ee3f0a3c3c054d44841eaa802a6a5a3ca84e2161829c8fc2d16f6d2ca"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.832816 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e8994cb9a43cd1104066657dd04961fa830bcba62b840acc79d34cdf4539086d"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.834694 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"08f317e7b02b29ac247bbf0291ab64f0c3626995056d1f6a84aec80e6ab10044"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.834724 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9468fadd4f6e5e13f325b2d96ee5a33074a859a18ff8f0ba0775e8fa351b601b"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.834908 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.849563 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" event={"ID":"8ed21892-8bb6-46f1-a593-16b869fb63d3","Type":"ContainerStarted","Data":"73bfc99c89866e0387e344f85cb3f9f45a5d8bee8053c24840139c175d3f9a9e"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.849613 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" event={"ID":"8ed21892-8bb6-46f1-a593-16b869fb63d3","Type":"ContainerStarted","Data":"45c813973d9e6af252e7b4c69b784f590bce99850ece5dac3d81361936788bce"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.861094 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gl2kg" event={"ID":"1a177e48-a514-480b-8315-2ad8d69c0cd4","Type":"ContainerStarted","Data":"1f9d3a4e6e0bc6bc702bc6fab5b16de9da0d42a2ac1c56ba9ea57c854018442b"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.869746 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" event={"ID":"9ab40180-9214-4fb8-a267-1ff710ad04e4","Type":"ContainerStarted","Data":"b464ee267a332da0afb2aa95ff3e8d79a24a4827aed74efebc8610c85782a80c"} Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.874225 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.874401 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.374378326 +0000 UTC m=+151.899409749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.874454 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 19:36:36.874883 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.37486617 +0000 UTC m=+151.899897593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.875205 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rk9ts container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.875245 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.887020 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-7q6tm" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.887203 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pkfkf" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.891527 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wfpd6" Feb 01 19:36:36 crc kubenswrapper[4961]: I0201 19:36:36.975233 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:36 crc kubenswrapper[4961]: E0201 
19:36:36.978281 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.478264065 +0000 UTC m=+152.003295488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.036249 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" podStartSLOduration=130.036214467 podStartE2EDuration="2m10.036214467s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:37.025380405 +0000 UTC m=+151.550411838" watchObservedRunningTime="2026-02-01 19:36:37.036214467 +0000 UTC m=+151.561245890" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.082271 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.082775 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.582761581 +0000 UTC m=+152.107793004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.115963 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" podStartSLOduration=130.115945831 podStartE2EDuration="2m10.115945831s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:37.073896973 +0000 UTC m=+151.598928396" watchObservedRunningTime="2026-02-01 19:36:37.115945831 +0000 UTC m=+151.640977254" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.184115 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.184402 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.684386107 +0000 UTC m=+152.209417530 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.290796 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.291113 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.791100495 +0000 UTC m=+152.316131918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.336932 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.346754 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.356478 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.356811 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.385918 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.391382 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.391741 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.891724283 +0000 UTC m=+152.416755706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.492761 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.492870 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.492926 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.493326 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:37.993310189 +0000 UTC m=+152.518341612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.581576 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:37 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:37 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:37 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.581652 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.594222 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.594407 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.094381939 +0000 UTC m=+152.619413362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.594446 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.594556 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.594638 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.594967 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.094959925 +0000 UTC m=+152.619991348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.595241 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.649297 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.694638 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.696012 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.696268 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.196216771 +0000 UTC m=+152.721248194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.696403 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.696870 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.196861418 +0000 UTC m=+152.721892841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.766613 4961 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fl9kc container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 01 19:36:37 crc kubenswrapper[4961]: [+]log ok Feb 01 19:36:37 crc kubenswrapper[4961]: [-]poststarthook/max-in-flight-filter failed: reason withheld Feb 01 19:36:37 crc kubenswrapper[4961]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 01 19:36:37 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.767115 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" podUID="5095491b-50cd-4181-97ac-da46cc4608ca" containerName="openshift-config-operator" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.790326 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fl9kc" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.798252 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.798590 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.298551127 +0000 UTC m=+152.823582560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.798665 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.799120 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.299104112 +0000 UTC m=+152.824135535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.833794 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.835007 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.840557 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.882152 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.900212 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:37 crc kubenswrapper[4961]: E0201 19:36:37.900595 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.400580414 +0000 UTC m=+152.925611837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.923223 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" event={"ID":"9ab40180-9214-4fb8-a267-1ff710ad04e4","Type":"ContainerStarted","Data":"e7592c0b80208b716a49b08a8581ab1b49b2fca593c6b4097638a09e3251a53f"} Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.925701 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rk9ts container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 01 19:36:37 crc kubenswrapper[4961]: I0201 19:36:37.925741 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.003160 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.003255 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.003280 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.003299 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9f9\" (UniqueName: \"kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.003717 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-01 19:36:38.503696662 +0000 UTC m=+153.028728085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.033395 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.034692 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.049368 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.081168 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.115542 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.115916 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.116242 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.116293 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9f9\" (UniqueName: \"kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.117290 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.617275742 +0000 UTC m=+153.142307165 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.130764 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.135770 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.210877 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9f9\" (UniqueName: \"kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9\") pod \"community-operators-rldqh\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.217497 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwt76\" (UniqueName: \"kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.217551 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.217590 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.217615 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.217906 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.71789322 +0000 UTC m=+153.242924643 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.225466 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.226689 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.240499 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.319621 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.319896 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.319976 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.319995 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwt76\" (UniqueName: \"kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.320023 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.320067 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc 
kubenswrapper[4961]: I0201 19:36:38.320092 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9nk\" (UniqueName: \"kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.320229 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.820213695 +0000 UTC m=+153.345245118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.320695 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.321300 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.388193 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwt76\" (UniqueName: \"kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76\") pod \"certified-operators-7kjfx\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.409550 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.415347 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429030 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429135 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429165 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429192 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9nk\" (UniqueName: \"kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.429467 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:38.929438274 +0000 UTC m=+153.454469697 (durationBeforeRetry 500ms). 
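[Editor's note] Every failed mount/unmount above is followed by "No retries permitted until <time> (durationBeforeRetry 500ms)": the operation is parked and the reconciler only re-attempts it after the delay elapses. As a sketch of that bookkeeping only (names and structure are assumptions, not the kubelet's nestedpendingoperations code), a per-key retry gate could look like this.

// retrygate_sketch.go - illustrative sketch of the "No retries permitted until ..." pattern.
// All names here are assumptions for illustration, not kubelet internals.
package main

import (
    "errors"
    "fmt"
    "sync"
    "time"
)

type retryGate struct {
    mu        sync.Mutex
    notBefore map[string]time.Time // earliest allowed next attempt, per operation key
}

func newRetryGate() *retryGate {
    return &retryGate{notBefore: make(map[string]time.Time)}
}

// Run executes op unless the key is still inside its backoff window.
// On failure it records now+delay, mirroring "durationBeforeRetry 500ms".
func (g *retryGate) Run(key string, delay time.Duration, op func() error) error {
    g.mu.Lock()
    if until, held := g.notBefore[key]; held && time.Now().Before(until) {
        g.mu.Unlock()
        return fmt.Errorf("no retries permitted until %s", until.Format(time.RFC3339Nano))
    }
    g.mu.Unlock()

    err := op()
    if err != nil {
        g.mu.Lock()
        g.notBefore[key] = time.Now().Add(delay)
        g.mu.Unlock()
    }
    return err
}

func main() {
    gate := newRetryGate()
    key := "kubernetes.io/csi/example^pvc-0000" // placeholder volume key
    fail := func() error { return errors.New("driver not yet registered") }

    for i := 0; i < 3; i++ {
        if err := gate.Run(key, 500*time.Millisecond, fail); err != nil {
            fmt.Println("attempt", i, "->", err)
        }
        time.Sleep(200 * time.Millisecond) // the reconciler ticks faster than the backoff window
    }
}

[End of editor's note; the log entry interrupted above continues below.]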
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429803 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.429919 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.440138 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.451224 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.533331 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.533548 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr6xs\" (UniqueName: \"kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.533635 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.533666 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.533821 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-01 19:36:39.033805357 +0000 UTC m=+153.558836780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.550588 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.551793 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9nk\" (UniqueName: \"kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk\") pod \"community-operators-jdhb2\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.572578 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:38 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:38 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:38 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.572629 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.606646 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.635131 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr6xs\" (UniqueName: \"kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.635223 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.635255 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.635313 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.635641 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.135627868 +0000 UTC m=+153.660659291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.635956 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.636031 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.659447 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr6xs\" (UniqueName: \"kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs\") pod \"certified-operators-dk2b9\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.678342 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.741559 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.742373 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.242356158 +0000 UTC m=+153.767387581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.748542 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.823110 4961 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.850689 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.851250 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.351234516 +0000 UTC m=+153.876265939 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.957571 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:38 crc kubenswrapper[4961]: E0201 19:36:38.957949 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.457933144 +0000 UTC m=+153.982964567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:38 crc kubenswrapper[4961]: I0201 19:36:38.996981 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7ef56ba-e930-4050-9544-77be701ea9fb","Type":"ContainerStarted","Data":"33577695fc9559aaf788eda6b009204a11e141fd9258fb31a24446fab4e9c422"} Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.064546 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: E0201 19:36:39.065158 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.565132576 +0000 UTC m=+154.090164199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.078606 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:36:39 crc kubenswrapper[4961]: W0201 19:36:39.122796 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbc19872_375c_4adc_88ca_bd7e84e57f63.slice/crio-cb39a782bb3e03b206dac93227f7d0e35df9892f60ea87e8bc0c332390345371 WatchSource:0}: Error finding container cb39a782bb3e03b206dac93227f7d0e35df9892f60ea87e8bc0c332390345371: Status 404 returned error can't find the container with id cb39a782bb3e03b206dac93227f7d0e35df9892f60ea87e8bc0c332390345371 Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.166942 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:39 crc kubenswrapper[4961]: E0201 19:36:39.168659 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.668635705 +0000 UTC m=+154.193667128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.269100 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: E0201 19:36:39.269531 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.7695168 +0000 UTC m=+154.294548223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnz77" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.369516 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:39 crc kubenswrapper[4961]: E0201 19:36:39.371399 4961 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-01 19:36:39.871360112 +0000 UTC m=+154.396391525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.396871 4961 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-01T19:36:38.823142719Z","Handler":null,"Name":""} Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.408245 4961 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.408314 4961 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.437321 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.471686 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.497514 4961 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
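[Editor's note] The repeated failures above resolve themselves once the plugin watcher picks up /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock: RegisterPlugin runs, csi_plugin.go validates and registers kubevirt.io.hostpath-provisioner at /var/lib/kubelet/plugins/csi-hostpath/csi.sock, and the next MountVolume attempt no longer reports "driver name ... not found in the list of registered CSI drivers"; MountDevice is then skipped because the driver does not advertise STAGE_UNSTAGE_VOLUME. As an illustration only (assumed types, not kubelet source), the lookup-then-capability check behind those two log outcomes amounts to:

// csiregistry_sketch.go - illustrative model of the register-then-mount sequence above.
// All types and names are assumptions for illustration; this is not kubelet code.
package main

import (
    "fmt"
    "sync"
)

type driverInfo struct {
    endpoint     string // e.g. a unix socket under /var/lib/kubelet/plugins/...
    stageUnstage bool   // whether the driver advertises STAGE_UNSTAGE_VOLUME
}

type csiRegistry struct {
    mu      sync.RWMutex
    drivers map[string]driverInfo
}

func (r *csiRegistry) register(name string, info driverInfo) {
    r.mu.Lock()
    defer r.mu.Unlock()
    r.drivers[name] = info
}

// mountDevice mirrors the two outcomes in the log: an error while the driver is
// unknown, and a no-op ("Skipping MountDevice") once it is known but does not
// support staging.
func (r *csiRegistry) mountDevice(driverName, volumeID string) error {
    r.mu.RLock()
    info, found := r.drivers[driverName]
    r.mu.RUnlock()
    if !found {
        return fmt.Errorf("driver name %s not found in the list of registered CSI drivers", driverName)
    }
    if !info.stageUnstage {
        fmt.Printf("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice for %s\n", volumeID)
        return nil
    }
    fmt.Printf("would call NodeStageVolume on %s for %s\n", info.endpoint, volumeID)
    return nil
}

func main() {
    reg := &csiRegistry{drivers: map[string]driverInfo{}}
    const driver = "kubevirt.io.hostpath-provisioner"
    const vol = "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"

    // Before registration: the error repeated throughout the log above.
    fmt.Println(reg.mountDevice(driver, vol))

    // The plugin watcher finds the registration socket and the driver is registered.
    reg.register(driver, driverInfo{endpoint: "/var/lib/kubelet/plugins/csi-hostpath/csi.sock", stageUnstage: false})

    // After registration: MountDevice is skipped and SetUp can proceed.
    fmt.Println(reg.mountDevice(driver, vol))
}

[End of editor's note; the log resumes below with the volume finally mounting and tearing down cleanly.]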
Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.497563 4961 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.561771 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.575142 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnz77\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.580280 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.581418 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:39 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:39 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:39 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.581467 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.676868 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.720574 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.789997 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.791145 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.793279 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.805813 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.877351 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.879474 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.879522 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.879572 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4rj\" (UniqueName: \"kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.981207 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.981285 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.981333 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4rj\" (UniqueName: \"kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.982704 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:39 crc kubenswrapper[4961]: I0201 19:36:39.982878 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.006020 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerStarted","Data":"c305247e85e8ed5fa8bfd4de9664e531e1e5d5053b2a449c539c7259e6ba9422"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.006852 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerStarted","Data":"37eb68f96cf498a3de0fea20b59aac441f0bf29e7b58c2b801bdb1774dc0ebca"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.011949 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" event={"ID":"9ab40180-9214-4fb8-a267-1ff710ad04e4","Type":"ContainerStarted","Data":"ae4d6860f18169991fa084266d0a94a5506680037d748969ce62a6ba50127b19"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.012001 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" event={"ID":"9ab40180-9214-4fb8-a267-1ff710ad04e4","Type":"ContainerStarted","Data":"9c3572276194f5f3c305b8bb1fd49f2db735525fae6c058d50cfb93afa5e9883"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.015327 4961 generic.go:334] "Generic (PLEG): container finished" podID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerID="94a633e0f1897eeea94c87f2a6501e848a61fa31ddb906c27a93dc9508558e52" exitCode=0 Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.015438 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerDied","Data":"94a633e0f1897eeea94c87f2a6501e848a61fa31ddb906c27a93dc9508558e52"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.015473 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerStarted","Data":"cb39a782bb3e03b206dac93227f7d0e35df9892f60ea87e8bc0c332390345371"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.017031 4961 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.017674 4961 generic.go:334] "Generic (PLEG): container finished" podID="966c0385-f333-4a81-972f-9e5569c11e11" containerID="1d870a45dfb2187fc623e94f4e05c781f66c32e64bc644f396cffcacd0f529fa" exitCode=0 Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.017731 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerDied","Data":"1d870a45dfb2187fc623e94f4e05c781f66c32e64bc644f396cffcacd0f529fa"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.017747 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerStarted","Data":"1dec2a4f65bc5d2c4b54164a9aecb94d9fffe3ba166868badd650cab5876a66b"} Feb 
01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.021431 4961 generic.go:334] "Generic (PLEG): container finished" podID="e7ef56ba-e930-4050-9544-77be701ea9fb" containerID="678f0b9fd52c1ade91e25f9c3bc3d7d52ee040572b0b8b6832c4c8827db0146a" exitCode=0 Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.021461 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7ef56ba-e930-4050-9544-77be701ea9fb","Type":"ContainerDied","Data":"678f0b9fd52c1ade91e25f9c3bc3d7d52ee040572b0b8b6832c4c8827db0146a"} Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.025265 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4rj\" (UniqueName: \"kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj\") pod \"redhat-marketplace-fm57v\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.101560 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-48wsp" podStartSLOduration=12.101330515 podStartE2EDuration="12.101330515s" podCreationTimestamp="2026-02-01 19:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:40.035640184 +0000 UTC m=+154.560671607" watchObservedRunningTime="2026-02-01 19:36:40.101330515 +0000 UTC m=+154.626361928" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.149354 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.193172 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.194308 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.206758 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.244361 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.259760 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.289201 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.289285 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8cdd\" (UniqueName: \"kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.289434 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.390849 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.390896 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8cdd\" (UniqueName: \"kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.390946 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.391654 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.391907 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content\") pod \"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.412728 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8cdd\" (UniqueName: \"kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd\") pod 
\"redhat-marketplace-4nz8k\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.488174 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.515660 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.527131 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-v684w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.527214 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-v684w" podUID="a6e8e0a4-fb78-4de0-8a48-3283e2911ac0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.527727 4961 patch_prober.go:28] interesting pod/downloads-7954f5f757-v684w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.527831 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-v684w" podUID="a6e8e0a4-fb78-4de0-8a48-3283e2911ac0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.565576 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:40 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:40 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:40 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.565640 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.731729 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.742448 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-d28vq" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.756599 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:36:40 crc kubenswrapper[4961]: W0201 19:36:40.769531 4961 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3cea628_f22c_4b45_b5ba_e55e856cda1a.slice/crio-9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4 WatchSource:0}: Error finding container 9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4: Status 404 returned error can't find the container with id 9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4 Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.985127 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.986077 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:40 crc kubenswrapper[4961]: I0201 19:36:40.995329 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.000224 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.002310 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.002639 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.005290 4961 patch_prober.go:28] interesting pod/apiserver-76f77b778f-xppsd container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]log ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]etcd ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/generic-apiserver-start-informers ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/max-in-flight-filter ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 01 19:36:41 crc kubenswrapper[4961]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/project.openshift.io-projectcache ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/openshift.io-startinformers ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 01 19:36:41 crc kubenswrapper[4961]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 01 19:36:41 crc kubenswrapper[4961]: livez check failed Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.005342 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" podUID="8ed21892-8bb6-46f1-a593-16b869fb63d3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.033883 4961 
generic.go:334] "Generic (PLEG): container finished" podID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerID="c07a406837f821ca868f967385f43d2f94d998c8f2582ed9c981759f9e0fa52a" exitCode=0 Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.034407 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerDied","Data":"c07a406837f821ca868f967385f43d2f94d998c8f2582ed9c981759f9e0fa52a"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.034464 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerStarted","Data":"9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.041706 4961 generic.go:334] "Generic (PLEG): container finished" podID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerID="6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466" exitCode=0 Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.041804 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerDied","Data":"6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.049461 4961 generic.go:334] "Generic (PLEG): container finished" podID="df83dd41-a7f5-4daf-8d82-43067543488a" containerID="c5147254c087559ef246c0f114896e355ed74aefc9d3055bbddc46b3656a6231" exitCode=0 Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.049528 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerDied","Data":"c5147254c087559ef246c0f114896e355ed74aefc9d3055bbddc46b3656a6231"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.049557 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerStarted","Data":"82b0811975bf4c358981263bb6f72cbc5dea142daa4e1d57ba9dd0ff9e9b4d9b"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.053742 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" event={"ID":"5723f9af-aa2e-4113-9e47-90f01ea33631","Type":"ContainerStarted","Data":"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.053794 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" event={"ID":"5723f9af-aa2e-4113-9e47-90f01ea33631","Type":"ContainerStarted","Data":"7c85fe4d697e3451ebd2586e24f5b0baaff324ee8b5b3d137e9aa42417819323"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.054624 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.059226 4961 generic.go:334] "Generic (PLEG): container finished" podID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerID="a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352" exitCode=0 Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.060094 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerDied","Data":"a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352"} Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.105466 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pzpj\" (UniqueName: \"kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.105641 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.105692 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.148394 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.152801 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.156674 4961 patch_prober.go:28] interesting pod/console-f9d7485db-dw6q4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.156740 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dw6q4" podUID="01c9f61a-85b0-499a-9885-1a02903ef457" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.161941 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" podStartSLOduration=134.161897534 podStartE2EDuration="2m14.161897534s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:41.145282399 +0000 UTC m=+155.670313822" watchObservedRunningTime="2026-02-01 19:36:41.161897534 +0000 UTC m=+155.686928957" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.208772 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pzpj\" (UniqueName: \"kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.209027 4961 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.209164 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.215592 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.218184 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.242288 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pzpj\" (UniqueName: \"kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj\") pod \"redhat-operators-jlbw2\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.324416 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.339362 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.398676 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:36:41 crc kubenswrapper[4961]: E0201 19:36:41.398893 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ef56ba-e930-4050-9544-77be701ea9fb" containerName="pruner" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.398907 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ef56ba-e930-4050-9544-77be701ea9fb" containerName="pruner" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.399003 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ef56ba-e930-4050-9544-77be701ea9fb" containerName="pruner" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.399766 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.406136 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.418088 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access\") pod \"e7ef56ba-e930-4050-9544-77be701ea9fb\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.418237 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir\") pod \"e7ef56ba-e930-4050-9544-77be701ea9fb\" (UID: \"e7ef56ba-e930-4050-9544-77be701ea9fb\") " Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.418479 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7ef56ba-e930-4050-9544-77be701ea9fb" (UID: "e7ef56ba-e930-4050-9544-77be701ea9fb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.426108 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7ef56ba-e930-4050-9544-77be701ea9fb" (UID: "e7ef56ba-e930-4050-9544-77be701ea9fb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.519176 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.519386 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cpv\" (UniqueName: \"kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.519473 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.519626 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7ef56ba-e930-4050-9544-77be701ea9fb-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.519639 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7ef56ba-e930-4050-9544-77be701ea9fb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.560825 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.565579 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:41 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:41 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:41 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.565646 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.621244 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.623956 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.624009 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cpv\" (UniqueName: \"kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.624090 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.624651 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.659228 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cpv\" (UniqueName: \"kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv\") pod \"redhat-operators-wbcn9\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.726903 4961 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.805377 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:36:41 crc kubenswrapper[4961]: I0201 19:36:41.900356 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.087908 4961 generic.go:334] "Generic (PLEG): container finished" podID="be03c00f-3bf0-4698-afe4-201249fe36a7" containerID="2713931c99e90e8105f56013d5bb9a64813e5dac535850cda522a005272ba78a" exitCode=0 Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.087993 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" event={"ID":"be03c00f-3bf0-4698-afe4-201249fe36a7","Type":"ContainerDied","Data":"2713931c99e90e8105f56013d5bb9a64813e5dac535850cda522a005272ba78a"} Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.090561 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7ef56ba-e930-4050-9544-77be701ea9fb","Type":"ContainerDied","Data":"33577695fc9559aaf788eda6b009204a11e141fd9258fb31a24446fab4e9c422"} Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.090587 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33577695fc9559aaf788eda6b009204a11e141fd9258fb31a24446fab4e9c422" Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.090628 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.113932 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerStarted","Data":"4227206e4a1b5a68854550b375c1909eb131e9364f2fe7b4e6b74bd8570fec0f"} Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.247915 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:36:42 crc kubenswrapper[4961]: W0201 19:36:42.279910 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100501b4_f817_42f7_951e_cbfd5f111f35.slice/crio-e278a120b296516a9deb658c5d2b84803e284bdb0f9ec13ef67bf54a26892286 WatchSource:0}: Error finding container e278a120b296516a9deb658c5d2b84803e284bdb0f9ec13ef67bf54a26892286: Status 404 returned error can't find the container with id e278a120b296516a9deb658c5d2b84803e284bdb0f9ec13ef67bf54a26892286 Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.564916 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:42 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:42 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:42 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:42 crc kubenswrapper[4961]: I0201 19:36:42.564970 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" 
podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.126425 4961 generic.go:334] "Generic (PLEG): container finished" podID="57b71857-2f5b-42af-be87-47936a224fe0" containerID="65473cb7747e3058137bc527ccafa0c8d032baaf3c5ba64d354bfd639270ee96" exitCode=0 Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.126525 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerDied","Data":"65473cb7747e3058137bc527ccafa0c8d032baaf3c5ba64d354bfd639270ee96"} Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.128364 4961 generic.go:334] "Generic (PLEG): container finished" podID="100501b4-f817-42f7-951e-cbfd5f111f35" containerID="b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd" exitCode=0 Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.128479 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerDied","Data":"b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd"} Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.128610 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerStarted","Data":"e278a120b296516a9deb658c5d2b84803e284bdb0f9ec13ef67bf54a26892286"} Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.392750 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.478798 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume\") pod \"be03c00f-3bf0-4698-afe4-201249fe36a7\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.478949 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume\") pod \"be03c00f-3bf0-4698-afe4-201249fe36a7\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.478979 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7ksr\" (UniqueName: \"kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr\") pod \"be03c00f-3bf0-4698-afe4-201249fe36a7\" (UID: \"be03c00f-3bf0-4698-afe4-201249fe36a7\") " Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.479941 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "be03c00f-3bf0-4698-afe4-201249fe36a7" (UID: "be03c00f-3bf0-4698-afe4-201249fe36a7"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.486448 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr" (OuterVolumeSpecName: "kube-api-access-h7ksr") pod "be03c00f-3bf0-4698-afe4-201249fe36a7" (UID: "be03c00f-3bf0-4698-afe4-201249fe36a7"). InnerVolumeSpecName "kube-api-access-h7ksr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.500537 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be03c00f-3bf0-4698-afe4-201249fe36a7" (UID: "be03c00f-3bf0-4698-afe4-201249fe36a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.564363 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:43 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:43 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:43 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.564436 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.580764 4961 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be03c00f-3bf0-4698-afe4-201249fe36a7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.580795 4961 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be03c00f-3bf0-4698-afe4-201249fe36a7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.580806 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7ksr\" (UniqueName: \"kubernetes.io/projected/be03c00f-3bf0-4698-afe4-201249fe36a7-kube-api-access-h7ksr\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.818870 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 19:36:43 crc kubenswrapper[4961]: E0201 19:36:43.819149 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be03c00f-3bf0-4698-afe4-201249fe36a7" containerName="collect-profiles" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.819163 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="be03c00f-3bf0-4698-afe4-201249fe36a7" containerName="collect-profiles" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.819261 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="be03c00f-3bf0-4698-afe4-201249fe36a7" containerName="collect-profiles" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.819658 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.826875 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.827442 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.842601 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.986242 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:43 crc kubenswrapper[4961]: I0201 19:36:43.986306 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.087236 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.087309 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.087377 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.113953 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.143175 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.190075 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" event={"ID":"be03c00f-3bf0-4698-afe4-201249fe36a7","Type":"ContainerDied","Data":"e1ef06f5d09589a522250a92d1173810a7aaaf8b3ddb551c23709c5db3e9c150"} Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.190136 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ef06f5d09589a522250a92d1173810a7aaaf8b3ddb551c23709c5db3e9c150" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.190386 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.564195 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:44 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:44 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:44 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.564251 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:44 crc kubenswrapper[4961]: I0201 19:36:44.645373 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 01 19:36:45 crc kubenswrapper[4961]: I0201 19:36:45.260496 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4554aaf1-73c7-4673-a03e-e3b9d9588aed","Type":"ContainerStarted","Data":"f3ecf8c3480d5239b52c6948de30848c9364cea488f9ffd6cde6d9f6e8c794df"} Feb 01 19:36:45 crc kubenswrapper[4961]: I0201 19:36:45.565091 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:45 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:45 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:45 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:45 crc kubenswrapper[4961]: I0201 19:36:45.565158 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:45 crc kubenswrapper[4961]: I0201 19:36:45.991517 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:45 crc kubenswrapper[4961]: I0201 19:36:45.998402 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-xppsd" Feb 01 19:36:46 crc kubenswrapper[4961]: I0201 19:36:46.314190 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4554aaf1-73c7-4673-a03e-e3b9d9588aed","Type":"ContainerStarted","Data":"5e312915811c108d9617317cbc55232343ff9e03a7410875b9336ce906e990a3"} Feb 01 19:36:46 crc kubenswrapper[4961]: I0201 19:36:46.579186 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:46 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:46 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:46 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:46 crc kubenswrapper[4961]: I0201 19:36:46.579235 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:46 crc kubenswrapper[4961]: I0201 19:36:46.736344 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wgjjb" Feb 01 19:36:46 crc kubenswrapper[4961]: I0201 19:36:46.772482 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.772465884 podStartE2EDuration="3.772465884s" podCreationTimestamp="2026-02-01 19:36:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:36:46.332460273 +0000 UTC m=+160.857491696" watchObservedRunningTime="2026-02-01 19:36:46.772465884 +0000 UTC m=+161.297497317" Feb 01 19:36:47 crc kubenswrapper[4961]: I0201 19:36:47.321165 4961 generic.go:334] "Generic (PLEG): container finished" podID="4554aaf1-73c7-4673-a03e-e3b9d9588aed" containerID="5e312915811c108d9617317cbc55232343ff9e03a7410875b9336ce906e990a3" exitCode=0 Feb 01 19:36:47 crc kubenswrapper[4961]: I0201 19:36:47.321221 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4554aaf1-73c7-4673-a03e-e3b9d9588aed","Type":"ContainerDied","Data":"5e312915811c108d9617317cbc55232343ff9e03a7410875b9336ce906e990a3"} Feb 01 19:36:47 crc kubenswrapper[4961]: I0201 19:36:47.564952 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:47 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:47 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:47 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:47 crc kubenswrapper[4961]: I0201 19:36:47.565494 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:48 crc kubenswrapper[4961]: I0201 19:36:48.564724 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:48 crc kubenswrapper[4961]: 
[-]has-synced failed: reason withheld Feb 01 19:36:48 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:48 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:48 crc kubenswrapper[4961]: I0201 19:36:48.564817 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:49 crc kubenswrapper[4961]: I0201 19:36:49.564534 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:49 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:49 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:49 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:49 crc kubenswrapper[4961]: I0201 19:36:49.564986 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:49 crc kubenswrapper[4961]: I0201 19:36:49.797668 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:49 crc kubenswrapper[4961]: I0201 19:36:49.820931 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b35577aa-3e17-4907-80d9-b8911ea55c25-metrics-certs\") pod \"network-metrics-daemon-j6fs5\" (UID: \"b35577aa-3e17-4907-80d9-b8911ea55c25\") " pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.047933 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j6fs5" Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.282024 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.282107 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.544214 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-v684w" Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.565771 4961 patch_prober.go:28] interesting pod/router-default-5444994796-tjxcx container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 01 19:36:50 crc kubenswrapper[4961]: [-]has-synced failed: reason withheld Feb 01 19:36:50 crc kubenswrapper[4961]: [+]process-running ok Feb 01 19:36:50 crc kubenswrapper[4961]: healthz check failed Feb 01 19:36:50 crc kubenswrapper[4961]: I0201 19:36:50.565823 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-tjxcx" podUID="da8e51d5-7a7d-436d-ab17-ac15406f4e03" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 01 19:36:51 crc kubenswrapper[4961]: I0201 19:36:51.149263 4961 patch_prober.go:28] interesting pod/console-f9d7485db-dw6q4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 01 19:36:51 crc kubenswrapper[4961]: I0201 19:36:51.149319 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dw6q4" podUID="01c9f61a-85b0-499a-9885-1a02903ef457" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 01 19:36:51 crc kubenswrapper[4961]: I0201 19:36:51.565141 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:51 crc kubenswrapper[4961]: I0201 19:36:51.568787 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-tjxcx" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.046207 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.145302 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access\") pod \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.145447 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir\") pod \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\" (UID: \"4554aaf1-73c7-4673-a03e-e3b9d9588aed\") " Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.145793 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4554aaf1-73c7-4673-a03e-e3b9d9588aed" (UID: "4554aaf1-73c7-4673-a03e-e3b9d9588aed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.152258 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4554aaf1-73c7-4673-a03e-e3b9d9588aed" (UID: "4554aaf1-73c7-4673-a03e-e3b9d9588aed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.247950 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.248093 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4554aaf1-73c7-4673-a03e-e3b9d9588aed-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.383664 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4554aaf1-73c7-4673-a03e-e3b9d9588aed","Type":"ContainerDied","Data":"f3ecf8c3480d5239b52c6948de30848c9364cea488f9ffd6cde6d9f6e8c794df"} Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.383710 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3ecf8c3480d5239b52c6948de30848c9364cea488f9ffd6cde6d9f6e8c794df" Feb 01 19:36:53 crc kubenswrapper[4961]: I0201 19:36:53.383757 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 01 19:36:59 crc kubenswrapper[4961]: I0201 19:36:59.884083 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:37:01 crc kubenswrapper[4961]: I0201 19:37:01.159539 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:37:01 crc kubenswrapper[4961]: I0201 19:37:01.162752 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dw6q4" Feb 01 19:37:06 crc kubenswrapper[4961]: E0201 19:37:06.956829 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 01 19:37:06 crc kubenswrapper[4961]: E0201 19:37:06.957354 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t9f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rldqh_openshift-marketplace(dbc19872-375c-4adc-88ca-bd7e84e57f63): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:06 crc kubenswrapper[4961]: E0201 19:37:06.958637 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rldqh" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" Feb 01 19:37:09 crc kubenswrapper[4961]: E0201 19:37:09.868113 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rldqh" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" Feb 01 19:37:09 crc kubenswrapper[4961]: E0201 19:37:09.924916 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 01 19:37:09 crc kubenswrapper[4961]: E0201 19:37:09.925432 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s8cdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4nz8k_openshift-marketplace(d3cea628-f22c-4b45-b5ba-e55e856cda1a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:09 crc kubenswrapper[4961]: E0201 19:37:09.926604 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4nz8k" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.101137 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4nz8k" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.211706 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.212320 4961 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwt76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7kjfx_openshift-marketplace(966c0385-f333-4a81-972f-9e5569c11e11): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.213416 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7kjfx" podUID="966c0385-f333-4a81-972f-9e5569c11e11" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.230324 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.230457 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kk9nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-jdhb2_openshift-marketplace(50d7779f-dfee-45fc-b93e-693e9f069d37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.231641 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-jdhb2" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.257425 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.257786 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9pzpj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jlbw2_openshift-marketplace(57b71857-2f5b-42af-be87-47936a224fe0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.259738 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jlbw2" podUID="57b71857-2f5b-42af-be87-47936a224fe0" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.261610 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.261773 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lr6xs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dk2b9_openshift-marketplace(98d49479-bf49-43f8-9e40-4e5f9ed0fb41): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.266515 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dk2b9" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.272796 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.273254 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p4rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fm57v_openshift-marketplace(df83dd41-a7f5-4daf-8d82-43067543488a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.274484 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fm57v" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" Feb 01 19:37:11 crc kubenswrapper[4961]: I0201 19:37:11.494615 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerStarted","Data":"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf"} Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.497781 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jlbw2" podUID="57b71857-2f5b-42af-be87-47936a224fe0" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.498618 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fm57v" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.498618 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dk2b9" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.498631 4961 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7kjfx" podUID="966c0385-f333-4a81-972f-9e5569c11e11" Feb 01 19:37:11 crc kubenswrapper[4961]: E0201 19:37:11.498689 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-jdhb2" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" Feb 01 19:37:11 crc kubenswrapper[4961]: I0201 19:37:11.589351 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j6fs5"] Feb 01 19:37:11 crc kubenswrapper[4961]: I0201 19:37:11.945366 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s6pw4" Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.503299 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" event={"ID":"b35577aa-3e17-4907-80d9-b8911ea55c25","Type":"ContainerStarted","Data":"64dfd24f90cdf5f4ca9b234387af6a7732f9400844baaf664b731e865e1480fb"} Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.503684 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" event={"ID":"b35577aa-3e17-4907-80d9-b8911ea55c25","Type":"ContainerStarted","Data":"daa592fb69ec856890be1977e82a1e1f378604dc6dc1dbb76c831e335ac1ced4"} Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.503718 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j6fs5" event={"ID":"b35577aa-3e17-4907-80d9-b8911ea55c25","Type":"ContainerStarted","Data":"7ecf2a611d84909e8672dad3bcd6d0f8a153efa7b4087d35c8ef56b9b641c440"} Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.506292 4961 generic.go:334] "Generic (PLEG): container finished" podID="100501b4-f817-42f7-951e-cbfd5f111f35" containerID="68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf" exitCode=0 Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.506334 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerDied","Data":"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf"} Feb 01 19:37:12 crc kubenswrapper[4961]: I0201 19:37:12.539343 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j6fs5" podStartSLOduration=165.539300625 podStartE2EDuration="2m45.539300625s" podCreationTimestamp="2026-02-01 19:34:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:37:12.520598651 +0000 UTC m=+187.045630154" watchObservedRunningTime="2026-02-01 19:37:12.539300625 +0000 UTC m=+187.064332058" Feb 01 19:37:13 crc kubenswrapper[4961]: I0201 19:37:13.515437 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerStarted","Data":"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb"} Feb 01 19:37:13 crc 
kubenswrapper[4961]: I0201 19:37:13.539544 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wbcn9" podStartSLOduration=2.766492494 podStartE2EDuration="32.539519456s" podCreationTimestamp="2026-02-01 19:36:41 +0000 UTC" firstStartedPulling="2026-02-01 19:36:43.131489161 +0000 UTC m=+157.656520584" lastFinishedPulling="2026-02-01 19:37:12.904516133 +0000 UTC m=+187.429547546" observedRunningTime="2026-02-01 19:37:13.535617927 +0000 UTC m=+188.060649390" watchObservedRunningTime="2026-02-01 19:37:13.539519456 +0000 UTC m=+188.064550919" Feb 01 19:37:14 crc kubenswrapper[4961]: I0201 19:37:14.281413 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 01 19:37:14 crc kubenswrapper[4961]: I0201 19:37:14.287295 4961 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podbe03c00f-3bf0-4698-afe4-201249fe36a7"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podbe03c00f-3bf0-4698-afe4-201249fe36a7] : Timed out while waiting for systemd to remove kubepods-burstable-podbe03c00f_3bf0_4698_afe4_201249fe36a7.slice" Feb 01 19:37:14 crc kubenswrapper[4961]: E0201 19:37:14.287374 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podbe03c00f-3bf0-4698-afe4-201249fe36a7] : unable to destroy cgroup paths for cgroup [kubepods burstable podbe03c00f-3bf0-4698-afe4-201249fe36a7] : Timed out while waiting for systemd to remove kubepods-burstable-podbe03c00f_3bf0_4698_afe4_201249fe36a7.slice" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" podUID="be03c00f-3bf0-4698-afe4-201249fe36a7" Feb 01 19:37:14 crc kubenswrapper[4961]: I0201 19:37:14.519228 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499570-lnpdj" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.814168 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 19:37:18 crc kubenswrapper[4961]: E0201 19:37:18.815022 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4554aaf1-73c7-4673-a03e-e3b9d9588aed" containerName="pruner" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.815260 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4554aaf1-73c7-4673-a03e-e3b9d9588aed" containerName="pruner" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.815399 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="4554aaf1-73c7-4673-a03e-e3b9d9588aed" containerName="pruner" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.815865 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.819775 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.820062 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.861503 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.861573 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.863980 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.962978 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.963089 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.963448 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:18 crc kubenswrapper[4961]: I0201 19:37:18.989586 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:19 crc kubenswrapper[4961]: I0201 19:37:19.172956 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:19 crc kubenswrapper[4961]: I0201 19:37:19.422748 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"] Feb 01 19:37:19 crc kubenswrapper[4961]: I0201 19:37:19.581181 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 01 19:37:19 crc kubenswrapper[4961]: W0201 19:37:19.598300 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3567c5a9_ebc9_4eea_81cc_96953bc4ef4a.slice/crio-1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f WatchSource:0}: Error finding container 1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f: Status 404 returned error can't find the container with id 1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f Feb 01 19:37:20 crc kubenswrapper[4961]: I0201 19:37:20.281810 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:37:20 crc kubenswrapper[4961]: I0201 19:37:20.282260 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:37:20 crc kubenswrapper[4961]: I0201 19:37:20.552622 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a","Type":"ContainerStarted","Data":"72f5f3ec04b7c37f451be0d4928e8b98c4158f1ab74c76962100440da518122f"} Feb 01 19:37:20 crc kubenswrapper[4961]: I0201 19:37:20.552710 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a","Type":"ContainerStarted","Data":"1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f"} Feb 01 19:37:21 crc kubenswrapper[4961]: I0201 19:37:21.576763 4961 generic.go:334] "Generic (PLEG): container finished" podID="3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" containerID="72f5f3ec04b7c37f451be0d4928e8b98c4158f1ab74c76962100440da518122f" exitCode=0 Feb 01 19:37:21 crc kubenswrapper[4961]: I0201 19:37:21.577196 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a","Type":"ContainerDied","Data":"72f5f3ec04b7c37f451be0d4928e8b98c4158f1ab74c76962100440da518122f"} Feb 01 19:37:21 crc kubenswrapper[4961]: I0201 19:37:21.727485 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:21 crc kubenswrapper[4961]: I0201 19:37:21.727608 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:21 crc kubenswrapper[4961]: I0201 19:37:21.873747 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:22 crc kubenswrapper[4961]: I0201 19:37:22.639157 4961 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.013176 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.035562 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access\") pod \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.035735 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir\") pod \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\" (UID: \"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a\") " Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.036363 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" (UID: "3567c5a9-ebc9-4eea-81cc-96953bc4ef4a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.050191 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" (UID: "3567c5a9-ebc9-4eea-81cc-96953bc4ef4a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.114706 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.137813 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.137865 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3567c5a9-ebc9-4eea-81cc-96953bc4ef4a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.595743 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3567c5a9-ebc9-4eea-81cc-96953bc4ef4a","Type":"ContainerDied","Data":"1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f"} Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.595821 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dabcfd133ae60a4527a693d2562c6511a73028bb8ba0abc7f36cfd52f46e59f" Feb 01 19:37:23 crc kubenswrapper[4961]: I0201 19:37:23.595770 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 01 19:37:24 crc kubenswrapper[4961]: I0201 19:37:24.605058 4961 generic.go:334] "Generic (PLEG): container finished" podID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerID="c602277d1088207d492427574abde8e58196d339d44e3fdf94775562232447b7" exitCode=0 Feb 01 19:37:24 crc kubenswrapper[4961]: I0201 19:37:24.605516 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerDied","Data":"c602277d1088207d492427574abde8e58196d339d44e3fdf94775562232447b7"} Feb 01 19:37:24 crc kubenswrapper[4961]: I0201 19:37:24.608065 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wbcn9" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="registry-server" containerID="cri-o://0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb" gracePeriod=2 Feb 01 19:37:24 crc kubenswrapper[4961]: I0201 19:37:24.608219 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerStarted","Data":"5047e21e0dec0312120fb20c481363fc8c98a0cab2327d5ec2817a1df158aed1"} Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.024571 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.066957 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content\") pod \"100501b4-f817-42f7-951e-cbfd5f111f35\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.067120 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities\") pod \"100501b4-f817-42f7-951e-cbfd5f111f35\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.067204 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98cpv\" (UniqueName: \"kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv\") pod \"100501b4-f817-42f7-951e-cbfd5f111f35\" (UID: \"100501b4-f817-42f7-951e-cbfd5f111f35\") " Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.068102 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities" (OuterVolumeSpecName: "utilities") pod "100501b4-f817-42f7-951e-cbfd5f111f35" (UID: "100501b4-f817-42f7-951e-cbfd5f111f35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.074925 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv" (OuterVolumeSpecName: "kube-api-access-98cpv") pod "100501b4-f817-42f7-951e-cbfd5f111f35" (UID: "100501b4-f817-42f7-951e-cbfd5f111f35"). InnerVolumeSpecName "kube-api-access-98cpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.168630 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.168664 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98cpv\" (UniqueName: \"kubernetes.io/projected/100501b4-f817-42f7-951e-cbfd5f111f35-kube-api-access-98cpv\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.220612 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "100501b4-f817-42f7-951e-cbfd5f111f35" (UID: "100501b4-f817-42f7-951e-cbfd5f111f35"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.275403 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/100501b4-f817-42f7-951e-cbfd5f111f35-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.622794 4961 generic.go:334] "Generic (PLEG): container finished" podID="966c0385-f333-4a81-972f-9e5569c11e11" containerID="5047e21e0dec0312120fb20c481363fc8c98a0cab2327d5ec2817a1df158aed1" exitCode=0 Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.622880 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerDied","Data":"5047e21e0dec0312120fb20c481363fc8c98a0cab2327d5ec2817a1df158aed1"} Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.628939 4961 generic.go:334] "Generic (PLEG): container finished" podID="100501b4-f817-42f7-951e-cbfd5f111f35" containerID="0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb" exitCode=0 Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.629026 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wbcn9" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.629104 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerDied","Data":"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb"} Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.629211 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wbcn9" event={"ID":"100501b4-f817-42f7-951e-cbfd5f111f35","Type":"ContainerDied","Data":"e278a120b296516a9deb658c5d2b84803e284bdb0f9ec13ef67bf54a26892286"} Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.629290 4961 scope.go:117] "RemoveContainer" containerID="0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.634784 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerStarted","Data":"e6b02cce14f6d2fd56d1cb3c4d6532a51f0511fc38d04c848957503b5392a463"} Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.649855 4961 scope.go:117] "RemoveContainer" containerID="68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.676430 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rldqh" podStartSLOduration=3.685429721 podStartE2EDuration="48.676403229s" podCreationTimestamp="2026-02-01 19:36:37 +0000 UTC" firstStartedPulling="2026-02-01 19:36:40.016775196 +0000 UTC m=+154.541806619" lastFinishedPulling="2026-02-01 19:37:25.007748704 +0000 UTC m=+199.532780127" observedRunningTime="2026-02-01 19:37:25.672485943 +0000 UTC m=+200.197517366" watchObservedRunningTime="2026-02-01 19:37:25.676403229 +0000 UTC m=+200.201434672" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.690278 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.693602 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wbcn9"] Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.695817 4961 scope.go:117] "RemoveContainer" containerID="b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.707111 4961 scope.go:117] "RemoveContainer" containerID="0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb" Feb 01 19:37:25 crc kubenswrapper[4961]: E0201 19:37:25.708633 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb\": container with ID starting with 0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb not found: ID does not exist" containerID="0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.708677 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb"} err="failed to get container status \"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb\": rpc error: code = NotFound 
desc = could not find container \"0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb\": container with ID starting with 0dff522dab34e62beefb74d7ada4ad8a22624fcbe9aa046cf474c0affdb36afb not found: ID does not exist" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.708753 4961 scope.go:117] "RemoveContainer" containerID="68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf" Feb 01 19:37:25 crc kubenswrapper[4961]: E0201 19:37:25.709094 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf\": container with ID starting with 68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf not found: ID does not exist" containerID="68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.709123 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf"} err="failed to get container status \"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf\": rpc error: code = NotFound desc = could not find container \"68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf\": container with ID starting with 68543f8fe4b2e2cf0d5156cd3da21c8b37053708cdb3fe5510ad798cb51f18cf not found: ID does not exist" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.709172 4961 scope.go:117] "RemoveContainer" containerID="b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd" Feb 01 19:37:25 crc kubenswrapper[4961]: E0201 19:37:25.709538 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd\": container with ID starting with b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd not found: ID does not exist" containerID="b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd" Feb 01 19:37:25 crc kubenswrapper[4961]: I0201 19:37:25.709579 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd"} err="failed to get container status \"b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd\": rpc error: code = NotFound desc = could not find container \"b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd\": container with ID starting with b0170ba4f9939e77bd96ca4261dea8a5bfa4f3ddd3feedb6a3dd7411e670d4cd not found: ID does not exist" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.244562 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" path="/var/lib/kubelet/pods/100501b4-f817-42f7-951e-cbfd5f111f35/volumes" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413185 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 19:37:26 crc kubenswrapper[4961]: E0201 19:37:26.413385 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="extract-utilities" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413400 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="extract-utilities" Feb 01 19:37:26 crc 
kubenswrapper[4961]: E0201 19:37:26.413409 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" containerName="pruner" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413415 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" containerName="pruner" Feb 01 19:37:26 crc kubenswrapper[4961]: E0201 19:37:26.413427 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="registry-server" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413433 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="registry-server" Feb 01 19:37:26 crc kubenswrapper[4961]: E0201 19:37:26.413448 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="extract-content" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413455 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="extract-content" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413562 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="100501b4-f817-42f7-951e-cbfd5f111f35" containerName="registry-server" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413577 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="3567c5a9-ebc9-4eea-81cc-96953bc4ef4a" containerName="pruner" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.413902 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.417027 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.421462 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.434901 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.494327 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.494644 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.494763 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.596384 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.596493 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.596533 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.596669 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.596729 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.624856 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access\") pod \"installer-9-crc\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.642792 4961 generic.go:334] "Generic (PLEG): container finished" podID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerID="853b97815c74999678a2dd0ba0d12a36a111c4fb24715e2f08b70ba74263c25b" exitCode=0 Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.642849 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerDied","Data":"853b97815c74999678a2dd0ba0d12a36a111c4fb24715e2f08b70ba74263c25b"} Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.645380 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerStarted","Data":"cc0a2d8045c1c17eaf025d1b206664559c23b3fde3a5287324cbfc61e865f4ba"} Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.649579 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerStarted","Data":"c0a09011249958fcf57f61c4eab77fcaf2b11f24d9cfe978faa816f855004522"} Feb 01 19:37:26 crc kubenswrapper[4961]: I0201 19:37:26.737161 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.205584 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7kjfx" podStartSLOduration=4.207912027 podStartE2EDuration="50.205549222s" podCreationTimestamp="2026-02-01 19:36:37 +0000 UTC" firstStartedPulling="2026-02-01 19:36:40.019324347 +0000 UTC m=+154.544355770" lastFinishedPulling="2026-02-01 19:37:26.016961542 +0000 UTC m=+200.541992965" observedRunningTime="2026-02-01 19:37:26.695783013 +0000 UTC m=+201.220814436" watchObservedRunningTime="2026-02-01 19:37:27.205549222 +0000 UTC m=+201.730580635" Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.209213 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 01 19:37:27 crc kubenswrapper[4961]: W0201 19:37:27.264626 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4bc77e5d_a2c6_48a6_acc9_dc952b52c4be.slice/crio-7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1 WatchSource:0}: Error finding container 7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1: Status 404 returned error can't find the container with id 7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1 Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.659878 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be","Type":"ContainerStarted","Data":"883058cd4e6bb7ce6886e932b892c0b1bff0f70246a547c9dc1a3f10c5bbcac5"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.660307 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be","Type":"ContainerStarted","Data":"7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.662734 4961 generic.go:334] "Generic (PLEG): container finished" podID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerID="1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141" exitCode=0 Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.662795 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerDied","Data":"1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.667148 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerStarted","Data":"ec4ebc0d368d1cdf7880a2eed4a1c89b68e6d2e356c7da0600cb401b91ba009e"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.669397 4961 generic.go:334] "Generic (PLEG): container finished" podID="df83dd41-a7f5-4daf-8d82-43067543488a" containerID="fffbbd1c868c9eef4e12bc21215b54299609a95e5c00251aecc1d84242a31ce2" exitCode=0 Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.669460 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerDied","Data":"fffbbd1c868c9eef4e12bc21215b54299609a95e5c00251aecc1d84242a31ce2"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.673063 4961 
generic.go:334] "Generic (PLEG): container finished" podID="57b71857-2f5b-42af-be87-47936a224fe0" containerID="c0a09011249958fcf57f61c4eab77fcaf2b11f24d9cfe978faa816f855004522" exitCode=0 Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.673101 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerDied","Data":"c0a09011249958fcf57f61c4eab77fcaf2b11f24d9cfe978faa816f855004522"} Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.682131 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.6821103800000001 podStartE2EDuration="1.68211038s" podCreationTimestamp="2026-02-01 19:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:37:27.680916308 +0000 UTC m=+202.205947741" watchObservedRunningTime="2026-02-01 19:37:27.68211038 +0000 UTC m=+202.207141813" Feb 01 19:37:27 crc kubenswrapper[4961]: I0201 19:37:27.776091 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4nz8k" podStartSLOduration=1.79994467 podStartE2EDuration="47.776075668s" podCreationTimestamp="2026-02-01 19:36:40 +0000 UTC" firstStartedPulling="2026-02-01 19:36:41.036811432 +0000 UTC m=+155.561842855" lastFinishedPulling="2026-02-01 19:37:27.01294243 +0000 UTC m=+201.537973853" observedRunningTime="2026-02-01 19:37:27.773177039 +0000 UTC m=+202.298208492" watchObservedRunningTime="2026-02-01 19:37:27.776075668 +0000 UTC m=+202.301107091" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.452484 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.453089 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.489161 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.679199 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.679279 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.702748 4961 generic.go:334] "Generic (PLEG): container finished" podID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerID="00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716" exitCode=0 Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.702833 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerDied","Data":"00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716"} Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.715454 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" 
event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerStarted","Data":"33c48b3475cb12d9262847285cdb225e0968ec7ffa84a43e90d1e1af83a0e5f2"} Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.717967 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerStarted","Data":"a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc"} Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.719958 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.720442 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerStarted","Data":"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e"} Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.753278 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fm57v" podStartSLOduration=2.771004728 podStartE2EDuration="49.753255188s" podCreationTimestamp="2026-02-01 19:36:39 +0000 UTC" firstStartedPulling="2026-02-01 19:36:41.051197955 +0000 UTC m=+155.576229378" lastFinishedPulling="2026-02-01 19:37:28.033448415 +0000 UTC m=+202.558479838" observedRunningTime="2026-02-01 19:37:28.750419321 +0000 UTC m=+203.275450744" watchObservedRunningTime="2026-02-01 19:37:28.753255188 +0000 UTC m=+203.278286611" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.773741 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlbw2" podStartSLOduration=3.8435665180000003 podStartE2EDuration="48.773722952s" podCreationTimestamp="2026-02-01 19:36:40 +0000 UTC" firstStartedPulling="2026-02-01 19:36:43.135246197 +0000 UTC m=+157.660277620" lastFinishedPulling="2026-02-01 19:37:28.065402621 +0000 UTC m=+202.590434054" observedRunningTime="2026-02-01 19:37:28.76883948 +0000 UTC m=+203.293870903" watchObservedRunningTime="2026-02-01 19:37:28.773722952 +0000 UTC m=+203.298754375" Feb 01 19:37:28 crc kubenswrapper[4961]: I0201 19:37:28.784704 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jdhb2" podStartSLOduration=3.717240337 podStartE2EDuration="50.784688829s" podCreationTimestamp="2026-02-01 19:36:38 +0000 UTC" firstStartedPulling="2026-02-01 19:36:41.060750152 +0000 UTC m=+155.585781575" lastFinishedPulling="2026-02-01 19:37:28.128198644 +0000 UTC m=+202.653230067" observedRunningTime="2026-02-01 19:37:28.783982991 +0000 UTC m=+203.309014414" watchObservedRunningTime="2026-02-01 19:37:28.784688829 +0000 UTC m=+203.309720252" Feb 01 19:37:29 crc kubenswrapper[4961]: I0201 19:37:29.730163 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerStarted","Data":"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da"} Feb 01 19:37:29 crc kubenswrapper[4961]: I0201 19:37:29.756829 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dk2b9" podStartSLOduration=3.711860359 podStartE2EDuration="51.756810162s" podCreationTimestamp="2026-02-01 19:36:38 +0000 UTC" firstStartedPulling="2026-02-01 
19:36:41.047164311 +0000 UTC m=+155.572195734" lastFinishedPulling="2026-02-01 19:37:29.092114114 +0000 UTC m=+203.617145537" observedRunningTime="2026-02-01 19:37:29.753622916 +0000 UTC m=+204.278654369" watchObservedRunningTime="2026-02-01 19:37:29.756810162 +0000 UTC m=+204.281841585" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.260271 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.260316 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.302928 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.516555 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.516605 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:30 crc kubenswrapper[4961]: I0201 19:37:30.556015 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:31 crc kubenswrapper[4961]: I0201 19:37:31.340956 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:37:31 crc kubenswrapper[4961]: I0201 19:37:31.341379 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:37:32 crc kubenswrapper[4961]: I0201 19:37:32.404777 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlbw2" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" probeResult="failure" output=< Feb 01 19:37:32 crc kubenswrapper[4961]: timeout: failed to connect service ":50051" within 1s Feb 01 19:37:32 crc kubenswrapper[4961]: > Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.531259 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.607686 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.607731 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.665710 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.741050 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.749654 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.749718 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 
19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.822480 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.853998 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:38 crc kubenswrapper[4961]: I0201 19:37:38.869436 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:37:40 crc kubenswrapper[4961]: I0201 19:37:40.026122 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:37:40 crc kubenswrapper[4961]: I0201 19:37:40.332986 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:37:40 crc kubenswrapper[4961]: I0201 19:37:40.572572 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:40 crc kubenswrapper[4961]: I0201 19:37:40.795475 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jdhb2" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="registry-server" containerID="cri-o://c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e" gracePeriod=2 Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.032620 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.032875 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dk2b9" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="registry-server" containerID="cri-o://83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da" gracePeriod=2 Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.204021 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.242534 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities\") pod \"50d7779f-dfee-45fc-b93e-693e9f069d37\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.242683 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content\") pod \"50d7779f-dfee-45fc-b93e-693e9f069d37\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.242742 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9nk\" (UniqueName: \"kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk\") pod \"50d7779f-dfee-45fc-b93e-693e9f069d37\" (UID: \"50d7779f-dfee-45fc-b93e-693e9f069d37\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.243764 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities" (OuterVolumeSpecName: "utilities") pod "50d7779f-dfee-45fc-b93e-693e9f069d37" (UID: "50d7779f-dfee-45fc-b93e-693e9f069d37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.261317 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk" (OuterVolumeSpecName: "kube-api-access-kk9nk") pod "50d7779f-dfee-45fc-b93e-693e9f069d37" (UID: "50d7779f-dfee-45fc-b93e-693e9f069d37"). InnerVolumeSpecName "kube-api-access-kk9nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.302290 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50d7779f-dfee-45fc-b93e-693e9f069d37" (UID: "50d7779f-dfee-45fc-b93e-693e9f069d37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.344609 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.344669 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50d7779f-dfee-45fc-b93e-693e9f069d37-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.344727 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9nk\" (UniqueName: \"kubernetes.io/projected/50d7779f-dfee-45fc-b93e-693e9f069d37-kube-api-access-kk9nk\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.380289 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.428517 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.430812 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.547557 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr6xs\" (UniqueName: \"kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs\") pod \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.547625 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content\") pod \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.547691 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities\") pod \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\" (UID: \"98d49479-bf49-43f8-9e40-4e5f9ed0fb41\") " Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.548443 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities" (OuterVolumeSpecName: "utilities") pod "98d49479-bf49-43f8-9e40-4e5f9ed0fb41" (UID: "98d49479-bf49-43f8-9e40-4e5f9ed0fb41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.551482 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs" (OuterVolumeSpecName: "kube-api-access-lr6xs") pod "98d49479-bf49-43f8-9e40-4e5f9ed0fb41" (UID: "98d49479-bf49-43f8-9e40-4e5f9ed0fb41"). InnerVolumeSpecName "kube-api-access-lr6xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.588155 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98d49479-bf49-43f8-9e40-4e5f9ed0fb41" (UID: "98d49479-bf49-43f8-9e40-4e5f9ed0fb41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.648673 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.648709 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr6xs\" (UniqueName: \"kubernetes.io/projected/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-kube-api-access-lr6xs\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.648719 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98d49479-bf49-43f8-9e40-4e5f9ed0fb41-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.804453 4961 generic.go:334] "Generic (PLEG): container finished" podID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerID="c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e" exitCode=0 Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.804560 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jdhb2" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.804586 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerDied","Data":"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e"} Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.804630 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jdhb2" event={"ID":"50d7779f-dfee-45fc-b93e-693e9f069d37","Type":"ContainerDied","Data":"c305247e85e8ed5fa8bfd4de9664e531e1e5d5053b2a449c539c7259e6ba9422"} Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.804656 4961 scope.go:117] "RemoveContainer" containerID="c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.808736 4961 generic.go:334] "Generic (PLEG): container finished" podID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerID="83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da" exitCode=0 Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.808791 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerDied","Data":"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da"} Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.808847 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dk2b9" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.808860 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dk2b9" event={"ID":"98d49479-bf49-43f8-9e40-4e5f9ed0fb41","Type":"ContainerDied","Data":"37eb68f96cf498a3de0fea20b59aac441f0bf29e7b58c2b801bdb1774dc0ebca"} Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.826002 4961 scope.go:117] "RemoveContainer" containerID="1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.852428 4961 scope.go:117] "RemoveContainer" containerID="a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.880241 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.882682 4961 scope.go:117] "RemoveContainer" containerID="c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.883154 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e\": container with ID starting with c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e not found: ID does not exist" containerID="c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.883209 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e"} err="failed to get container status \"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e\": rpc error: code = NotFound desc = could not find container \"c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e\": container with ID starting with c61a6b4c8216b1808472de85a566555f1101a570cf5e1deee89d280d84a42f7e not found: ID does not exist" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.883247 4961 scope.go:117] "RemoveContainer" containerID="1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.883667 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141\": container with ID starting with 1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141 not found: ID does not exist" containerID="1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.883734 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141"} err="failed to get container status \"1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141\": rpc error: code = NotFound desc = could not find container \"1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141\": container with ID starting with 1ecc5f094d8dec22348836b45b6a59d473b6759c385da8fbcf0fec0c3a765141 not found: ID does not exist" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.883810 4961 scope.go:117] "RemoveContainer" 
containerID="a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.884263 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352\": container with ID starting with a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352 not found: ID does not exist" containerID="a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.884300 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352"} err="failed to get container status \"a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352\": rpc error: code = NotFound desc = could not find container \"a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352\": container with ID starting with a24bab076287754c9c9bf4c931775ba6f5c99a71276332468b5a6287b20af352 not found: ID does not exist" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.884329 4961 scope.go:117] "RemoveContainer" containerID="83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.885629 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jdhb2"] Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.892809 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.897417 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dk2b9"] Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.897602 4961 scope.go:117] "RemoveContainer" containerID="00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.922674 4961 scope.go:117] "RemoveContainer" containerID="6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.939547 4961 scope.go:117] "RemoveContainer" containerID="83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.939981 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da\": container with ID starting with 83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da not found: ID does not exist" containerID="83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.940025 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da"} err="failed to get container status \"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da\": rpc error: code = NotFound desc = could not find container \"83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da\": container with ID starting with 83e867d1a4c106dd197e419b040a3a70421da01d991e791e3a555d008b0413da not found: ID does not exist" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.940112 4961 scope.go:117] "RemoveContainer" 
containerID="00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.941971 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716\": container with ID starting with 00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716 not found: ID does not exist" containerID="00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.942088 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716"} err="failed to get container status \"00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716\": rpc error: code = NotFound desc = could not find container \"00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716\": container with ID starting with 00a38726fcb15baabaf6fc88f76039411a8af8e4a127323cfad1623e21f6c716 not found: ID does not exist" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.942153 4961 scope.go:117] "RemoveContainer" containerID="6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466" Feb 01 19:37:41 crc kubenswrapper[4961]: E0201 19:37:41.942644 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466\": container with ID starting with 6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466 not found: ID does not exist" containerID="6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466" Feb 01 19:37:41 crc kubenswrapper[4961]: I0201 19:37:41.942697 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466"} err="failed to get container status \"6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466\": rpc error: code = NotFound desc = could not find container \"6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466\": container with ID starting with 6e38e41a77a05699411e4c26572e86fb95d67f845947f52340555d6d5c8ba466 not found: ID does not exist" Feb 01 19:37:42 crc kubenswrapper[4961]: I0201 19:37:42.247636 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" path="/var/lib/kubelet/pods/50d7779f-dfee-45fc-b93e-693e9f069d37/volumes" Feb 01 19:37:42 crc kubenswrapper[4961]: I0201 19:37:42.249347 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" path="/var/lib/kubelet/pods/98d49479-bf49-43f8-9e40-4e5f9ed0fb41/volumes" Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.429234 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.429848 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4nz8k" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="registry-server" containerID="cri-o://ec4ebc0d368d1cdf7880a2eed4a1c89b68e6d2e356c7da0600cb401b91ba009e" gracePeriod=2 Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.828657 4961 generic.go:334] "Generic (PLEG): container finished" 
podID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerID="ec4ebc0d368d1cdf7880a2eed4a1c89b68e6d2e356c7da0600cb401b91ba009e" exitCode=0 Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.828719 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerDied","Data":"ec4ebc0d368d1cdf7880a2eed4a1c89b68e6d2e356c7da0600cb401b91ba009e"} Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.828781 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nz8k" event={"ID":"d3cea628-f22c-4b45-b5ba-e55e856cda1a","Type":"ContainerDied","Data":"9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4"} Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.828802 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9127253d304141e950452e9851b7e7dc0ccfee9ebb79a9d6ad8c955bd7863ad4" Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.834871 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.882657 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content\") pod \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.882701 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8cdd\" (UniqueName: \"kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd\") pod \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.882845 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities\") pod \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\" (UID: \"d3cea628-f22c-4b45-b5ba-e55e856cda1a\") " Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.884102 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities" (OuterVolumeSpecName: "utilities") pod "d3cea628-f22c-4b45-b5ba-e55e856cda1a" (UID: "d3cea628-f22c-4b45-b5ba-e55e856cda1a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.889583 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd" (OuterVolumeSpecName: "kube-api-access-s8cdd") pod "d3cea628-f22c-4b45-b5ba-e55e856cda1a" (UID: "d3cea628-f22c-4b45-b5ba-e55e856cda1a"). InnerVolumeSpecName "kube-api-access-s8cdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:43 crc kubenswrapper[4961]: I0201 19:37:43.913542 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3cea628-f22c-4b45-b5ba-e55e856cda1a" (UID: "d3cea628-f22c-4b45-b5ba-e55e856cda1a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.338024 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.338078 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8cdd\" (UniqueName: \"kubernetes.io/projected/d3cea628-f22c-4b45-b5ba-e55e856cda1a-kube-api-access-s8cdd\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.338093 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3cea628-f22c-4b45-b5ba-e55e856cda1a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.455454 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" containerName="oauth-openshift" containerID="cri-o://5e9de551bddfbf9190d413e647e605f55a919f1904ecf91884cafc1a6b725d07" gracePeriod=15 Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.837659 4961 generic.go:334] "Generic (PLEG): container finished" podID="20a2004c-d84f-4be5-b205-70e27386ad37" containerID="5e9de551bddfbf9190d413e647e605f55a919f1904ecf91884cafc1a6b725d07" exitCode=0 Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.837725 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" event={"ID":"20a2004c-d84f-4be5-b205-70e27386ad37","Type":"ContainerDied","Data":"5e9de551bddfbf9190d413e647e605f55a919f1904ecf91884cafc1a6b725d07"} Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.838271 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nz8k" Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.869816 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.873357 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nz8k"] Feb 01 19:37:44 crc kubenswrapper[4961]: I0201 19:37:44.905955 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046494 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046558 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046584 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046608 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcxq2\" (UniqueName: \"kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046633 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046664 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046692 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046711 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046729 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc 
kubenswrapper[4961]: I0201 19:37:45.046745 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046761 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046802 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046818 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.046838 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template\") pod \"20a2004c-d84f-4be5-b205-70e27386ad37\" (UID: \"20a2004c-d84f-4be5-b205-70e27386ad37\") " Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.047530 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.048003 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.048096 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.048531 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.048726 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.051541 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2" (OuterVolumeSpecName: "kube-api-access-bcxq2") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "kube-api-access-bcxq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.051726 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.052051 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.052294 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.052502 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.055752 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.055813 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.056350 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.056499 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "20a2004c-d84f-4be5-b205-70e27386ad37" (UID: "20a2004c-d84f-4be5-b205-70e27386ad37"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148390 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148432 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148447 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148459 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148472 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148485 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148501 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcxq2\" (UniqueName: \"kubernetes.io/projected/20a2004c-d84f-4be5-b205-70e27386ad37-kube-api-access-bcxq2\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148513 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148524 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148535 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148549 4961 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20a2004c-d84f-4be5-b205-70e27386ad37-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148563 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148575 4961 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20a2004c-d84f-4be5-b205-70e27386ad37-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.148589 4961 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20a2004c-d84f-4be5-b205-70e27386ad37-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.847409 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" event={"ID":"20a2004c-d84f-4be5-b205-70e27386ad37","Type":"ContainerDied","Data":"5956def3689d2176baf87cb40e98b6ccfa62593259a3c0fd9e81aa596c738b23"} Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.847499 4961 scope.go:117] "RemoveContainer" containerID="5e9de551bddfbf9190d413e647e605f55a919f1904ecf91884cafc1a6b725d07" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.847703 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-597bm" Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.883961 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"] Feb 01 19:37:45 crc kubenswrapper[4961]: I0201 19:37:45.886931 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-597bm"] Feb 01 19:37:46 crc kubenswrapper[4961]: I0201 19:37:46.243223 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" path="/var/lib/kubelet/pods/20a2004c-d84f-4be5-b205-70e27386ad37/volumes" Feb 01 19:37:46 crc kubenswrapper[4961]: I0201 19:37:46.243953 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" path="/var/lib/kubelet/pods/d3cea628-f22c-4b45-b5ba-e55e856cda1a/volumes" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.789661 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-9d558594f-cfff6"] Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790171 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790186 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790199 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790205 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790215 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790222 4961 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790230 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790236 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790241 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790247 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790257 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790262 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="extract-utilities" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790270 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790276 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790314 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790319 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="extract-content" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790331 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" containerName="oauth-openshift" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790336 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" containerName="oauth-openshift" Feb 01 19:37:49 crc kubenswrapper[4961]: E0201 19:37:49.790344 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790349 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790436 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d7779f-dfee-45fc-b93e-693e9f069d37" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790446 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d49479-bf49-43f8-9e40-4e5f9ed0fb41" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790454 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3cea628-f22c-4b45-b5ba-e55e856cda1a" containerName="registry-server" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 
19:37:49.790461 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a2004c-d84f-4be5-b205-70e27386ad37" containerName="oauth-openshift" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.790794 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.798294 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.798333 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.798645 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.798768 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.798858 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.799253 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.799458 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.801800 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.802149 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.802181 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.802260 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.806816 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9d558594f-cfff6"] Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.807428 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.820398 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.828991 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.832424 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.917846 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.917932 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-audit-policies\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.917968 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.917993 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918016 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918065 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918092 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-error\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918216 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07290702-e530-478b-aa6f-432c7c65f651-audit-dir\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918248 4961 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918288 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-login\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918353 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlhbz\" (UniqueName: \"kubernetes.io/projected/07290702-e530-478b-aa6f-432c7c65f651-kube-api-access-qlhbz\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918389 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-session\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918426 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:49 crc kubenswrapper[4961]: I0201 19:37:49.918448 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019554 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019597 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " 
pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019634 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019669 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-audit-policies\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019694 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019716 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019734 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019754 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019774 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-error\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019791 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07290702-e530-478b-aa6f-432c7c65f651-audit-dir\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc 
kubenswrapper[4961]: I0201 19:37:50.019811 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019838 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-login\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019861 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlhbz\" (UniqueName: \"kubernetes.io/projected/07290702-e530-478b-aa6f-432c7c65f651-kube-api-access-qlhbz\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.019884 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-session\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.021125 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-audit-policies\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.021315 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07290702-e530-478b-aa6f-432c7c65f651-audit-dir\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.021432 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-service-ca\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.021625 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-cliconfig\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.022379 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027018 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027062 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-router-certs\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027328 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-serving-cert\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027430 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-login\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027534 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-session\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.027634 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.028760 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.033416 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/07290702-e530-478b-aa6f-432c7c65f651-v4-0-config-user-template-error\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.039522 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlhbz\" (UniqueName: \"kubernetes.io/projected/07290702-e530-478b-aa6f-432c7c65f651-kube-api-access-qlhbz\") pod \"oauth-openshift-9d558594f-cfff6\" (UID: \"07290702-e530-478b-aa6f-432c7c65f651\") " pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.107304 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.281427 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.281925 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.281990 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.282805 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361"} pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.282885 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" containerID="cri-o://f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361" gracePeriod=600 Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.516054 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-9d558594f-cfff6"] Feb 01 19:37:50 crc kubenswrapper[4961]: W0201 19:37:50.519676 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07290702_e530_478b_aa6f_432c7c65f651.slice/crio-95fe386afba32f4946dd2d67bc3ab25a790976e08a86320feb0520a190c091c5 WatchSource:0}: Error finding container 95fe386afba32f4946dd2d67bc3ab25a790976e08a86320feb0520a190c091c5: Status 404 returned error can't find the container with id 95fe386afba32f4946dd2d67bc3ab25a790976e08a86320feb0520a190c091c5 Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.887888 4961 generic.go:334] "Generic (PLEG): container finished" podID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" 
containerID="f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361" exitCode=0 Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.888906 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerDied","Data":"f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361"} Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.889470 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31"} Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.891326 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" event={"ID":"07290702-e530-478b-aa6f-432c7c65f651","Type":"ContainerStarted","Data":"01041038d5b9b0806932fac392e0a9fc719cf13990d62ba4ba3fd3e430fb715a"} Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.891355 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" event={"ID":"07290702-e530-478b-aa6f-432c7c65f651","Type":"ContainerStarted","Data":"95fe386afba32f4946dd2d67bc3ab25a790976e08a86320feb0520a190c091c5"} Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.891728 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:37:50 crc kubenswrapper[4961]: I0201 19:37:50.931570 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" podStartSLOduration=31.93151683 podStartE2EDuration="31.93151683s" podCreationTimestamp="2026-02-01 19:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:37:50.927729717 +0000 UTC m=+225.452761140" watchObservedRunningTime="2026-02-01 19:37:50.93151683 +0000 UTC m=+225.456548253" Feb 01 19:37:51 crc kubenswrapper[4961]: I0201 19:37:51.296309 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-9d558594f-cfff6" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.324704 4961 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327065 4961 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327488 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89" gracePeriod=15 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327648 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a" gracePeriod=15 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 
19:38:05.327658 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535" gracePeriod=15 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327723 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4" gracePeriod=15 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327808 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b" gracePeriod=15 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.327904 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328574 4961 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.328806 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328826 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.328855 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328881 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.328904 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328922 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.328945 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328960 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.328982 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.328996 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: 
E0201 19:38:05.329017 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329029 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.329084 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329096 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329262 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329372 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329392 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329413 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.329433 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.330897 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.333362 4961 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.372234 4961 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434584 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434644 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: 
I0201 19:38:05.434673 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434688 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434715 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434736 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434773 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.434794 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536249 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536300 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536321 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 
crc kubenswrapper[4961]: I0201 19:38:05.536349 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536378 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536392 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536419 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536409 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536473 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536463 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536542 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536436 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536507 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536547 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536518 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.536523 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.673778 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:05 crc kubenswrapper[4961]: W0201 19:38:05.694987 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-36ea34774d2d5bbddfbb9a727f3309a8a64384260d3ddf4b87d3f9efca5024ee WatchSource:0}: Error finding container 36ea34774d2d5bbddfbb9a727f3309a8a64384260d3ddf4b87d3f9efca5024ee: Status 404 returned error can't find the container with id 36ea34774d2d5bbddfbb9a727f3309a8a64384260d3ddf4b87d3f9efca5024ee Feb 01 19:38:05 crc kubenswrapper[4961]: E0201 19:38:05.699378 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18903699527f5d0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,LastTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.992213 4961 generic.go:334] "Generic (PLEG): container finished" 
podID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" containerID="883058cd4e6bb7ce6886e932b892c0b1bff0f70246a547c9dc1a3f10c5bbcac5" exitCode=0 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.992305 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be","Type":"ContainerDied","Data":"883058cd4e6bb7ce6886e932b892c0b1bff0f70246a547c9dc1a3f10c5bbcac5"} Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.993097 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.996013 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.998680 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.999519 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535" exitCode=0 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.999540 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a" exitCode=0 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.999549 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4" exitCode=0 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.999557 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b" exitCode=2 Feb 01 19:38:05 crc kubenswrapper[4961]: I0201 19:38:05.999733 4961 scope.go:117] "RemoveContainer" containerID="8b0de2045760dfe53b0a9320314cd60e601daf4831906678611890e4e12e8e59" Feb 01 19:38:06 crc kubenswrapper[4961]: I0201 19:38:06.002824 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07"} Feb 01 19:38:06 crc kubenswrapper[4961]: I0201 19:38:06.002882 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"36ea34774d2d5bbddfbb9a727f3309a8a64384260d3ddf4b87d3f9efca5024ee"} Feb 01 19:38:06 crc kubenswrapper[4961]: I0201 19:38:06.003721 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 
19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.003844 4961 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:06 crc kubenswrapper[4961]: I0201 19:38:06.236694 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.326615 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18903699527f5d0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,LastTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.397862 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.398278 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.398544 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.398944 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.399514 4961 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:06 crc kubenswrapper[4961]: I0201 19:38:06.399565 4961 controller.go:115] "failed to update lease using latest lease, fallback to ensure 
lease" err="failed 5 attempts to update lease" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.399937 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms" Feb 01 19:38:06 crc kubenswrapper[4961]: E0201 19:38:06.600823 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="400ms" Feb 01 19:38:07 crc kubenswrapper[4961]: E0201 19:38:07.002551 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.013899 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.330813 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.340928 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.470935 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock\") pod \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.471022 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access\") pod \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.471073 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir\") pod \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\" (UID: \"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.471379 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" (UID: "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.471417 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock" (OuterVolumeSpecName: "var-lock") pod "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" (UID: "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.482671 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" (UID: "4bc77e5d-a2c6-48a6-acc9-dc952b52c4be"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.572919 4961 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-var-lock\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.572975 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.572986 4961 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4bc77e5d-a2c6-48a6-acc9-dc952b52c4be-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.696289 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.696755 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.697188 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.697355 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:07 crc kubenswrapper[4961]: E0201 19:38:07.803984 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.875701 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.875752 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.875807 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.876010 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.876056 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.876072 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.976561 4961 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.976586 4961 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:07 crc kubenswrapper[4961]: I0201 19:38:07.976596 4961 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.025888 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.027647 4961 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89" exitCode=0 Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.027746 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.029751 4961 scope.go:117] "RemoveContainer" containerID="cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.036413 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"4bc77e5d-a2c6-48a6-acc9-dc952b52c4be","Type":"ContainerDied","Data":"7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1"} Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.036444 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e638b4b81f45995d32ba1711b653e40da093ecab3e53b7d64836d9fc67f29f1" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.036490 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.054789 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.055087 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.055280 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.055422 4961 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.057585 4961 scope.go:117] "RemoveContainer" containerID="6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.079106 4961 scope.go:117] "RemoveContainer" containerID="501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.098103 4961 scope.go:117] "RemoveContainer" containerID="b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.157894 4961 scope.go:117] "RemoveContainer" containerID="229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.178805 4961 scope.go:117] "RemoveContainer" containerID="98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.202738 4961 scope.go:117] "RemoveContainer" containerID="cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.203184 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\": container with ID starting with cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535 not found: ID does not exist" containerID="cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.203224 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535"} err="failed to get container status \"cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\": rpc error: code = NotFound desc = could not find container 
\"cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535\": container with ID starting with cfa9ff9029c2dddab98a55bd5e151410ac92e07b500595ab0e18be3698adb535 not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.203253 4961 scope.go:117] "RemoveContainer" containerID="6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.203500 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\": container with ID starting with 6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a not found: ID does not exist" containerID="6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.203538 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a"} err="failed to get container status \"6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\": rpc error: code = NotFound desc = could not find container \"6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a\": container with ID starting with 6a18ff6d33329235848f07118995ff94590a40bcea9758569a197430e0c1701a not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.203561 4961 scope.go:117] "RemoveContainer" containerID="501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.203981 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\": container with ID starting with 501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4 not found: ID does not exist" containerID="501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.204012 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4"} err="failed to get container status \"501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\": rpc error: code = NotFound desc = could not find container \"501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4\": container with ID starting with 501f22482bc11a680d0133d919da0a7e4f417a79cd646a5357a695580aaaa9b4 not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.204029 4961 scope.go:117] "RemoveContainer" containerID="b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.204948 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\": container with ID starting with b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b not found: ID does not exist" containerID="b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.204979 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b"} 
err="failed to get container status \"b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\": rpc error: code = NotFound desc = could not find container \"b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b\": container with ID starting with b1513027df1ae73e4bd7ef9b65445deb71064b7b479038a3116a6e12bb512b2b not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.204998 4961 scope.go:117] "RemoveContainer" containerID="229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.205295 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\": container with ID starting with 229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89 not found: ID does not exist" containerID="229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.205334 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89"} err="failed to get container status \"229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\": rpc error: code = NotFound desc = could not find container \"229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89\": container with ID starting with 229979c2546a800661e02d27a1d5e11e8db89e2316c6a0994862119ac4e2df89 not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.205351 4961 scope.go:117] "RemoveContainer" containerID="98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de" Feb 01 19:38:08 crc kubenswrapper[4961]: E0201 19:38:08.205591 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\": container with ID starting with 98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de not found: ID does not exist" containerID="98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.205622 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de"} err="failed to get container status \"98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\": rpc error: code = NotFound desc = could not find container \"98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de\": container with ID starting with 98e0d2ad18c5328941b164b72eabfc9e093e2d4c5dc6ddd30478797d7f0b11de not found: ID does not exist" Feb 01 19:38:08 crc kubenswrapper[4961]: I0201 19:38:08.248262 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 01 19:38:09 crc kubenswrapper[4961]: E0201 19:38:09.405546 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="3.2s" Feb 01 19:38:12 crc kubenswrapper[4961]: E0201 19:38:12.607624 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="6.4s" Feb 01 19:38:16 crc kubenswrapper[4961]: I0201 19:38:16.239550 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:16 crc kubenswrapper[4961]: E0201 19:38:16.327693 4961 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.18903699527f5d0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,LastTimestamp:2026-02-01 19:38:05.698579727 +0000 UTC m=+240.223611151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.106524 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.106625 4961 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538" exitCode=1 Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.106676 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538"} Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.107579 4961 scope.go:117] "RemoveContainer" containerID="0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538" Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.107966 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:18 crc kubenswrapper[4961]: I0201 19:38:18.110375 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial 
tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:19 crc kubenswrapper[4961]: E0201 19:38:19.009408 4961 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="7s" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.121555 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.121620 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e794c9a42ec96d1bc4863a77315cf0c12c53ccdf012199695c47ccf6e654afc"} Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.122487 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.123099 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.234725 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.236120 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.236668 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.256636 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.256675 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:19 crc kubenswrapper[4961]: E0201 19:38:19.257246 4961 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:19 crc kubenswrapper[4961]: I0201 19:38:19.258013 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:19 crc kubenswrapper[4961]: W0201 19:38:19.283549 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-67a65aac77ccb9ef35356e5817e4a9c2b3e7df31a2a03aa9a426dcf3e33d95fb WatchSource:0}: Error finding container 67a65aac77ccb9ef35356e5817e4a9c2b3e7df31a2a03aa9a426dcf3e33d95fb: Status 404 returned error can't find the container with id 67a65aac77ccb9ef35356e5817e4a9c2b3e7df31a2a03aa9a426dcf3e33d95fb Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.128319 4961 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ff1e17ee5d4a75a25ccfa7a6e8e98763c41c496f1bd454a36d4b7d7312ab3778" exitCode=0 Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.128420 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ff1e17ee5d4a75a25ccfa7a6e8e98763c41c496f1bd454a36d4b7d7312ab3778"} Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.128469 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67a65aac77ccb9ef35356e5817e4a9c2b3e7df31a2a03aa9a426dcf3e33d95fb"} Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.128770 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.128786 4961 mirror_client.go:130] "Deleting a 
mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.129308 4961 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:20 crc kubenswrapper[4961]: E0201 19:38:20.129347 4961 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:20 crc kubenswrapper[4961]: I0201 19:38:20.129649 4961 status_manager.go:851] "Failed to get status for pod" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Feb 01 19:38:21 crc kubenswrapper[4961]: I0201 19:38:21.141866 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6cbc8db6355be4d758a59e7d18b1022e19a31947121b7083e4135ccb1aa1261a"} Feb 01 19:38:21 crc kubenswrapper[4961]: I0201 19:38:21.142127 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e0afc2948c145c1a14b4fc8cc83040071b09f568d0aee6eb28204f228a1610c"} Feb 01 19:38:21 crc kubenswrapper[4961]: I0201 19:38:21.142137 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20df33eecac1f83bfa41c632f3d950df7ee9ab9a799d156fbb2ff624023ad831"} Feb 01 19:38:21 crc kubenswrapper[4961]: I0201 19:38:21.142149 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e7a53ea817a997a92129272573bf2a0043a24b91b79a54e803564b51953b7500"} Feb 01 19:38:22 crc kubenswrapper[4961]: I0201 19:38:22.151202 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2614f3b58659d43bb46f064d51c9e16e6646270f5cdbd31386e2c89e62e706e5"} Feb 01 19:38:22 crc kubenswrapper[4961]: I0201 19:38:22.151506 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:22 crc kubenswrapper[4961]: I0201 19:38:22.151411 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:22 crc kubenswrapper[4961]: I0201 19:38:22.151532 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:23 crc kubenswrapper[4961]: I0201 19:38:23.014167 4961 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.258196 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.258499 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.262806 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.361085 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.361297 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 01 19:38:24 crc kubenswrapper[4961]: I0201 19:38:24.361343 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 01 19:38:27 crc kubenswrapper[4961]: I0201 19:38:27.158735 4961 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:27 crc kubenswrapper[4961]: I0201 19:38:27.179918 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:27 crc kubenswrapper[4961]: I0201 19:38:27.179954 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:27 crc kubenswrapper[4961]: I0201 19:38:27.183927 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:27 crc kubenswrapper[4961]: I0201 19:38:27.202480 4961 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4730064b-a8ca-43e2-9c7e-66adbdb9bf84" Feb 01 19:38:28 crc kubenswrapper[4961]: I0201 19:38:28.186012 4961 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:28 crc kubenswrapper[4961]: I0201 19:38:28.186130 4961 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="b47b7336-2148-44e9-83c1-582d0cd0d16f" Feb 01 19:38:28 crc kubenswrapper[4961]: I0201 19:38:28.189472 4961 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4730064b-a8ca-43e2-9c7e-66adbdb9bf84" Feb 01 19:38:34 crc kubenswrapper[4961]: I0201 19:38:34.361814 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 01 19:38:34 crc kubenswrapper[4961]: I0201 19:38:34.362502 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 01 19:38:36 crc kubenswrapper[4961]: I0201 19:38:36.791677 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.417424 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.501440 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.585131 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.619243 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.852549 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.905802 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 01 19:38:37 crc kubenswrapper[4961]: I0201 19:38:37.987124 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.198304 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.223897 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.257849 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.293693 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.661709 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.778130 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.787013 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 01 19:38:38 crc kubenswrapper[4961]: I0201 19:38:38.967930 4961 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.025644 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.110974 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.133726 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.152640 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.156562 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.211230 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.358186 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.474999 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.495308 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 01 19:38:39 crc kubenswrapper[4961]: I0201 19:38:39.841710 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.050697 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.133876 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.146387 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.148224 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.195854 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.274828 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.364190 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.402899 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.508492 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"default-dockercfg-chnjx" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.523928 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.575071 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.651729 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.739384 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.876496 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 01 19:38:40 crc kubenswrapper[4961]: I0201 19:38:40.939114 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.039438 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.043365 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.050971 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.191188 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.251540 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.262758 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.335448 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.430923 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.440736 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.468610 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.599885 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.613436 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.678607 4961 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.792680 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.812474 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.925075 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.978526 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 01 19:38:41 crc kubenswrapper[4961]: I0201 19:38:41.986616 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.002898 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.003115 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.110433 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.140241 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.264481 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.432886 4961 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.548684 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.550925 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.557393 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.590644 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.615136 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.632513 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.698700 4961 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.939075 
4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.991027 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 01 19:38:42 crc kubenswrapper[4961]: I0201 19:38:42.997151 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.078615 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.091072 4961 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.093477 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.126502 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.216798 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.254332 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.257138 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.303134 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.306344 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.322096 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.353369 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.427736 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.471103 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.484266 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.499819 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.653071 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.727073 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.788550 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.794556 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.803117 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.898986 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 01 19:38:43 crc kubenswrapper[4961]: I0201 19:38:43.918802 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.011376 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.190150 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.191131 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.214768 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.275418 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.286652 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.288595 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.301409 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.307250 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.312513 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.329129 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.361156 4961 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: 
connect: connection refused" start-of-body= Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.361244 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.361325 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.362426 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"7e794c9a42ec96d1bc4863a77315cf0c12c53ccdf012199695c47ccf6e654afc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.362675 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://7e794c9a42ec96d1bc4863a77315cf0c12c53ccdf012199695c47ccf6e654afc" gracePeriod=30 Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.368166 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.440345 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.456523 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.457447 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.544175 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.569612 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.587888 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.665470 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.694216 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.700648 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.737119 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.742512 4961 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.791156 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.949867 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 01 19:38:44 crc kubenswrapper[4961]: I0201 19:38:44.957133 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.100380 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.131813 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.244306 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.282501 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.346715 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.547574 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.628416 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.715453 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.715692 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.740880 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.783750 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.896144 4961 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.911899 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.929298 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.961613 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 01 19:38:45 crc kubenswrapper[4961]: I0201 19:38:45.970536 4961 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.050559 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.183128 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.183725 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.201724 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.225319 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.252712 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.296534 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.313834 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.317620 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.351206 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.375754 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.404784 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.470402 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.501284 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.534157 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.572520 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.645985 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.659110 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.697821 4961 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.703532 4961 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.775769 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.785855 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.946456 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 01 19:38:46 crc kubenswrapper[4961]: I0201 19:38:46.954121 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.003308 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.089973 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.112289 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.117894 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.155650 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.195534 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.275905 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.341239 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.342674 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.451400 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.553360 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.591939 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.737396 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 01 19:38:47 crc kubenswrapper[4961]: 
I0201 19:38:47.774279 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.850193 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.958358 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 01 19:38:47 crc kubenswrapper[4961]: I0201 19:38:47.963972 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.005317 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.039008 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.104403 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.124216 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.125007 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.175385 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.298802 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.313328 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.326173 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.332141 4961 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.336238 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.336289 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.340577 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.356250 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.356232971 podStartE2EDuration="21.356232971s" podCreationTimestamp="2026-02-01 19:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:38:48.355887882 +0000 UTC m=+282.880919305" 
watchObservedRunningTime="2026-02-01 19:38:48.356232971 +0000 UTC m=+282.881264394" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.441421 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.445001 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.551755 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.593462 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.642953 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.644789 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.655138 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.659339 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.679447 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.780413 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.798899 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.890201 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 01 19:38:48 crc kubenswrapper[4961]: I0201 19:38:48.959308 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.073066 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.108955 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.112390 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.145996 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.215055 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.227800 4961 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.301330 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.302373 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.314256 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.397697 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.471086 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.487725 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.619651 4961 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.619850 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07" gracePeriod=5 Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.629464 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.866549 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.939490 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 01 19:38:49 crc kubenswrapper[4961]: I0201 19:38:49.945780 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.158645 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.254713 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.270222 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.284384 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.285633 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 01 19:38:50 crc 
kubenswrapper[4961]: I0201 19:38:50.302439 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.351948 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.518716 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.532475 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.564547 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.567824 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.579022 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.662881 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.690312 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 01 19:38:50 crc kubenswrapper[4961]: I0201 19:38:50.854069 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.012237 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.062652 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.074998 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.075567 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7kjfx" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="registry-server" containerID="cri-o://cc0a2d8045c1c17eaf025d1b206664559c23b3fde3a5287324cbfc61e865f4ba" gracePeriod=30 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.090283 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.090678 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rldqh" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="registry-server" containerID="cri-o://e6b02cce14f6d2fd56d1cb3c4d6532a51f0511fc38d04c848957503b5392a463" gracePeriod=30 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.102623 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.103172 4961 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" containerID="cri-o://42acd8df684667178a16f73d4d7f3611feddbec330a529ff052adcf8fa6e7550" gracePeriod=30 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.107132 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.107708 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fm57v" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="registry-server" containerID="cri-o://33c48b3475cb12d9262847285cdb225e0968ec7ffa84a43e90d1e1af83a0e5f2" gracePeriod=30 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.115410 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.116834 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlbw2" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" containerID="cri-o://a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" gracePeriod=30 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.168943 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.271402 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.302742 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.315909 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 01 19:38:51 crc kubenswrapper[4961]: E0201 19:38:51.341148 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc is running failed: container process not found" containerID="a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.341353 4961 generic.go:334] "Generic (PLEG): container finished" podID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerID="e6b02cce14f6d2fd56d1cb3c4d6532a51f0511fc38d04c848957503b5392a463" exitCode=0 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.341420 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerDied","Data":"e6b02cce14f6d2fd56d1cb3c4d6532a51f0511fc38d04c848957503b5392a463"} Feb 01 19:38:51 crc kubenswrapper[4961]: E0201 19:38:51.342201 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc is running failed: 
container process not found" containerID="a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 19:38:51 crc kubenswrapper[4961]: E0201 19:38:51.342900 4961 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc is running failed: container process not found" containerID="a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" cmd=["grpc_health_probe","-addr=:50051"] Feb 01 19:38:51 crc kubenswrapper[4961]: E0201 19:38:51.342932 4961 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-jlbw2" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.345943 4961 generic.go:334] "Generic (PLEG): container finished" podID="966c0385-f333-4a81-972f-9e5569c11e11" containerID="cc0a2d8045c1c17eaf025d1b206664559c23b3fde3a5287324cbfc61e865f4ba" exitCode=0 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.346003 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerDied","Data":"cc0a2d8045c1c17eaf025d1b206664559c23b3fde3a5287324cbfc61e865f4ba"} Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.349532 4961 generic.go:334] "Generic (PLEG): container finished" podID="df83dd41-a7f5-4daf-8d82-43067543488a" containerID="33c48b3475cb12d9262847285cdb225e0968ec7ffa84a43e90d1e1af83a0e5f2" exitCode=0 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.349577 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerDied","Data":"33c48b3475cb12d9262847285cdb225e0968ec7ffa84a43e90d1e1af83a0e5f2"} Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.351210 4961 generic.go:334] "Generic (PLEG): container finished" podID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerID="42acd8df684667178a16f73d4d7f3611feddbec330a529ff052adcf8fa6e7550" exitCode=0 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.351285 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" event={"ID":"6198133a-f3a2-4b73-b4f7-103c5da5e684","Type":"ContainerDied","Data":"42acd8df684667178a16f73d4d7f3611feddbec330a529ff052adcf8fa6e7550"} Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.353151 4961 generic.go:334] "Generic (PLEG): container finished" podID="57b71857-2f5b-42af-be87-47936a224fe0" containerID="a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" exitCode=0 Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.353178 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerDied","Data":"a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc"} Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.500094 4961 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.552480 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.622227 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.622335 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.633064 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.637705 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.648698 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.698370 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content\") pod \"966c0385-f333-4a81-972f-9e5569c11e11\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.698444 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities\") pod \"966c0385-f333-4a81-972f-9e5569c11e11\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.698517 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwt76\" (UniqueName: \"kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76\") pod \"966c0385-f333-4a81-972f-9e5569c11e11\" (UID: \"966c0385-f333-4a81-972f-9e5569c11e11\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.703283 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76" (OuterVolumeSpecName: "kube-api-access-nwt76") pod "966c0385-f333-4a81-972f-9e5569c11e11" (UID: "966c0385-f333-4a81-972f-9e5569c11e11"). InnerVolumeSpecName "kube-api-access-nwt76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.705284 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities" (OuterVolumeSpecName: "utilities") pod "966c0385-f333-4a81-972f-9e5569c11e11" (UID: "966c0385-f333-4a81-972f-9e5569c11e11"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.745628 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "966c0385-f333-4a81-972f-9e5569c11e11" (UID: "966c0385-f333-4a81-972f-9e5569c11e11"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800026 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content\") pod \"57b71857-2f5b-42af-be87-47936a224fe0\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800143 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content\") pod \"dbc19872-375c-4adc-88ca-bd7e84e57f63\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800220 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pzpj\" (UniqueName: \"kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj\") pod \"57b71857-2f5b-42af-be87-47936a224fe0\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800247 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content\") pod \"df83dd41-a7f5-4daf-8d82-43067543488a\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800273 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca\") pod \"6198133a-f3a2-4b73-b4f7-103c5da5e684\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800318 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgxrv\" (UniqueName: \"kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv\") pod \"6198133a-f3a2-4b73-b4f7-103c5da5e684\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800340 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t9f9\" (UniqueName: \"kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9\") pod \"dbc19872-375c-4adc-88ca-bd7e84e57f63\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800382 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p4rj\" (UniqueName: \"kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj\") pod \"df83dd41-a7f5-4daf-8d82-43067543488a\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800406 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics\") pod \"6198133a-f3a2-4b73-b4f7-103c5da5e684\" (UID: \"6198133a-f3a2-4b73-b4f7-103c5da5e684\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800434 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities\") pod \"57b71857-2f5b-42af-be87-47936a224fe0\" (UID: \"57b71857-2f5b-42af-be87-47936a224fe0\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800479 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities\") pod \"dbc19872-375c-4adc-88ca-bd7e84e57f63\" (UID: \"dbc19872-375c-4adc-88ca-bd7e84e57f63\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800496 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities\") pod \"df83dd41-a7f5-4daf-8d82-43067543488a\" (UID: \"df83dd41-a7f5-4daf-8d82-43067543488a\") " Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800743 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800761 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966c0385-f333-4a81-972f-9e5569c11e11-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.800771 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwt76\" (UniqueName: \"kubernetes.io/projected/966c0385-f333-4a81-972f-9e5569c11e11-kube-api-access-nwt76\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.801566 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities" (OuterVolumeSpecName: "utilities") pod "df83dd41-a7f5-4daf-8d82-43067543488a" (UID: "df83dd41-a7f5-4daf-8d82-43067543488a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.801770 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6198133a-f3a2-4b73-b4f7-103c5da5e684" (UID: "6198133a-f3a2-4b73-b4f7-103c5da5e684"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.802747 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities" (OuterVolumeSpecName: "utilities") pod "dbc19872-375c-4adc-88ca-bd7e84e57f63" (UID: "dbc19872-375c-4adc-88ca-bd7e84e57f63"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.802989 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9" (OuterVolumeSpecName: "kube-api-access-9t9f9") pod "dbc19872-375c-4adc-88ca-bd7e84e57f63" (UID: "dbc19872-375c-4adc-88ca-bd7e84e57f63"). InnerVolumeSpecName "kube-api-access-9t9f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.803468 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj" (OuterVolumeSpecName: "kube-api-access-9pzpj") pod "57b71857-2f5b-42af-be87-47936a224fe0" (UID: "57b71857-2f5b-42af-be87-47936a224fe0"). InnerVolumeSpecName "kube-api-access-9pzpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.805146 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj" (OuterVolumeSpecName: "kube-api-access-6p4rj") pod "df83dd41-a7f5-4daf-8d82-43067543488a" (UID: "df83dd41-a7f5-4daf-8d82-43067543488a"). InnerVolumeSpecName "kube-api-access-6p4rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.805209 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities" (OuterVolumeSpecName: "utilities") pod "57b71857-2f5b-42af-be87-47936a224fe0" (UID: "57b71857-2f5b-42af-be87-47936a224fe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.805414 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv" (OuterVolumeSpecName: "kube-api-access-dgxrv") pod "6198133a-f3a2-4b73-b4f7-103c5da5e684" (UID: "6198133a-f3a2-4b73-b4f7-103c5da5e684"). InnerVolumeSpecName "kube-api-access-dgxrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.805450 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6198133a-f3a2-4b73-b4f7-103c5da5e684" (UID: "6198133a-f3a2-4b73-b4f7-103c5da5e684"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.823579 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df83dd41-a7f5-4daf-8d82-43067543488a" (UID: "df83dd41-a7f5-4daf-8d82-43067543488a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.835964 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.862663 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbc19872-375c-4adc-88ca-bd7e84e57f63" (UID: "dbc19872-375c-4adc-88ca-bd7e84e57f63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901582 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pzpj\" (UniqueName: \"kubernetes.io/projected/57b71857-2f5b-42af-be87-47936a224fe0-kube-api-access-9pzpj\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901621 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901635 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901651 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgxrv\" (UniqueName: \"kubernetes.io/projected/6198133a-f3a2-4b73-b4f7-103c5da5e684-kube-api-access-dgxrv\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901667 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t9f9\" (UniqueName: \"kubernetes.io/projected/dbc19872-375c-4adc-88ca-bd7e84e57f63-kube-api-access-9t9f9\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901678 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p4rj\" (UniqueName: \"kubernetes.io/projected/df83dd41-a7f5-4daf-8d82-43067543488a-kube-api-access-6p4rj\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901690 4961 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6198133a-f3a2-4b73-b4f7-103c5da5e684-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901702 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901713 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901725 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df83dd41-a7f5-4daf-8d82-43067543488a-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.901735 4961 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbc19872-375c-4adc-88ca-bd7e84e57f63-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.923346 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57b71857-2f5b-42af-be87-47936a224fe0" (UID: "57b71857-2f5b-42af-be87-47936a224fe0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:38:51 crc kubenswrapper[4961]: I0201 19:38:51.950486 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.002853 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57b71857-2f5b-42af-be87-47936a224fe0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.190291 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.358669 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.359699 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rk9ts" event={"ID":"6198133a-f3a2-4b73-b4f7-103c5da5e684","Type":"ContainerDied","Data":"84eec27ac48b95a31532f8d0659fccd7b0dda06951be74164e18a482179b007a"} Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.359768 4961 scope.go:117] "RemoveContainer" containerID="42acd8df684667178a16f73d4d7f3611feddbec330a529ff052adcf8fa6e7550" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.364661 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlbw2" event={"ID":"57b71857-2f5b-42af-be87-47936a224fe0","Type":"ContainerDied","Data":"4227206e4a1b5a68854550b375c1909eb131e9364f2fe7b4e6b74bd8570fec0f"} Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.364531 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlbw2" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.367381 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rldqh" event={"ID":"dbc19872-375c-4adc-88ca-bd7e84e57f63","Type":"ContainerDied","Data":"cb39a782bb3e03b206dac93227f7d0e35df9892f60ea87e8bc0c332390345371"} Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.368103 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rldqh" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.378211 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7kjfx" event={"ID":"966c0385-f333-4a81-972f-9e5569c11e11","Type":"ContainerDied","Data":"1dec2a4f65bc5d2c4b54164a9aecb94d9fffe3ba166868badd650cab5876a66b"} Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.378341 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7kjfx" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.382803 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fm57v" event={"ID":"df83dd41-a7f5-4daf-8d82-43067543488a","Type":"ContainerDied","Data":"82b0811975bf4c358981263bb6f72cbc5dea142daa4e1d57ba9dd0ff9e9b4d9b"} Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.382877 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fm57v" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.385963 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.391404 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rk9ts"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.393163 4961 scope.go:117] "RemoveContainer" containerID="a20ecbe51ef19b0dc8d03cc991f0344a0e9eceb358c72866759dcfd4555d29bc" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.409848 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.424264 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlbw2"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.424311 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.425160 4961 scope.go:117] "RemoveContainer" containerID="c0a09011249958fcf57f61c4eab77fcaf2b11f24d9cfe978faa816f855004522" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.430778 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7kjfx"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.445467 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.445578 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rldqh"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.447139 4961 scope.go:117] "RemoveContainer" containerID="65473cb7747e3058137bc527ccafa0c8d032baaf3c5ba64d354bfd639270ee96" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.447304 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.449700 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fm57v"] Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.462878 4961 scope.go:117] "RemoveContainer" containerID="e6b02cce14f6d2fd56d1cb3c4d6532a51f0511fc38d04c848957503b5392a463" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.480120 4961 scope.go:117] "RemoveContainer" containerID="c602277d1088207d492427574abde8e58196d339d44e3fdf94775562232447b7" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.493947 4961 scope.go:117] "RemoveContainer" containerID="94a633e0f1897eeea94c87f2a6501e848a61fa31ddb906c27a93dc9508558e52" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.510242 4961 scope.go:117] "RemoveContainer" 
containerID="cc0a2d8045c1c17eaf025d1b206664559c23b3fde3a5287324cbfc61e865f4ba" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.531022 4961 scope.go:117] "RemoveContainer" containerID="5047e21e0dec0312120fb20c481363fc8c98a0cab2327d5ec2817a1df158aed1" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.544027 4961 scope.go:117] "RemoveContainer" containerID="1d870a45dfb2187fc623e94f4e05c781f66c32e64bc644f396cffcacd0f529fa" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.551512 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.556374 4961 scope.go:117] "RemoveContainer" containerID="33c48b3475cb12d9262847285cdb225e0968ec7ffa84a43e90d1e1af83a0e5f2" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.566717 4961 scope.go:117] "RemoveContainer" containerID="fffbbd1c868c9eef4e12bc21215b54299609a95e5c00251aecc1d84242a31ce2" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.579021 4961 scope.go:117] "RemoveContainer" containerID="c5147254c087559ef246c0f114896e355ed74aefc9d3055bbddc46b3656a6231" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.597772 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.653841 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 01 19:38:52 crc kubenswrapper[4961]: I0201 19:38:52.704551 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 01 19:38:53 crc kubenswrapper[4961]: I0201 19:38:53.074844 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 01 19:38:53 crc kubenswrapper[4961]: I0201 19:38:53.142463 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 01 19:38:53 crc kubenswrapper[4961]: I0201 19:38:53.230244 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 01 19:38:54 crc kubenswrapper[4961]: I0201 19:38:54.254425 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b71857-2f5b-42af-be87-47936a224fe0" path="/var/lib/kubelet/pods/57b71857-2f5b-42af-be87-47936a224fe0/volumes" Feb 01 19:38:54 crc kubenswrapper[4961]: I0201 19:38:54.255276 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" path="/var/lib/kubelet/pods/6198133a-f3a2-4b73-b4f7-103c5da5e684/volumes" Feb 01 19:38:54 crc kubenswrapper[4961]: I0201 19:38:54.255833 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966c0385-f333-4a81-972f-9e5569c11e11" path="/var/lib/kubelet/pods/966c0385-f333-4a81-972f-9e5569c11e11/volumes" Feb 01 19:38:54 crc kubenswrapper[4961]: I0201 19:38:54.257010 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" path="/var/lib/kubelet/pods/dbc19872-375c-4adc-88ca-bd7e84e57f63/volumes" Feb 01 19:38:54 crc kubenswrapper[4961]: I0201 19:38:54.257579 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" path="/var/lib/kubelet/pods/df83dd41-a7f5-4daf-8d82-43067543488a/volumes" Feb 01 19:38:55 crc 
kubenswrapper[4961]: I0201 19:38:55.199603 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.200147 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237589 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237640 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237745 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237761 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237785 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237820 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237824 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237867 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.237963 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.238026 4961 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.238051 4961 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.238060 4961 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.238071 4961 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.252190 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.338906 4961 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.404170 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.404237 4961 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07" exitCode=137 Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.404301 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.404320 4961 scope.go:117] "RemoveContainer" containerID="42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.434126 4961 scope.go:117] "RemoveContainer" containerID="42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07" Feb 01 19:38:55 crc kubenswrapper[4961]: E0201 19:38:55.434666 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07\": container with ID starting with 42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07 not found: ID does not exist" containerID="42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07" Feb 01 19:38:55 crc kubenswrapper[4961]: I0201 19:38:55.434706 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07"} err="failed to get container status \"42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07\": rpc error: code = NotFound desc = could not find container \"42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07\": container with ID starting with 42c3a4e095cf94a5357691d66ffdc440e2f44304d680ae20198808059c6ccc07 not found: ID does not exist" Feb 01 19:38:56 crc kubenswrapper[4961]: I0201 19:38:56.257471 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 01 19:39:05 crc kubenswrapper[4961]: I0201 19:39:05.995401 4961 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 01 19:39:14 crc kubenswrapper[4961]: I0201 19:39:14.498990 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 01 19:39:14 crc kubenswrapper[4961]: I0201 19:39:14.502365 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 01 19:39:14 crc kubenswrapper[4961]: I0201 19:39:14.502418 4961 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7e794c9a42ec96d1bc4863a77315cf0c12c53ccdf012199695c47ccf6e654afc" exitCode=137 Feb 01 19:39:14 crc kubenswrapper[4961]: I0201 19:39:14.502446 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7e794c9a42ec96d1bc4863a77315cf0c12c53ccdf012199695c47ccf6e654afc"} Feb 01 19:39:14 crc kubenswrapper[4961]: I0201 19:39:14.502479 4961 scope.go:117] "RemoveContainer" containerID="0941196da3dc0d813ea145902204d1c397539328ee6144db1fe8100ef1f3e538" Feb 01 19:39:15 crc kubenswrapper[4961]: I0201 19:39:15.509637 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 01 19:39:15 crc kubenswrapper[4961]: I0201 19:39:15.511084 4961 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5fda066c83f0417d0c519de6e1603ba7dab0c1366d2301972296779c60c21d8a"} Feb 01 19:39:23 crc kubenswrapper[4961]: I0201 19:39:23.014393 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:39:24 crc kubenswrapper[4961]: I0201 19:39:24.361017 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:39:24 crc kubenswrapper[4961]: I0201 19:39:24.364995 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.017753 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.861461 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.861675 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerName="controller-manager" containerID="cri-o://28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9" gracePeriod=30 Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926557 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx8gj"] Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926789 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926825 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926835 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926844 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926854 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926862 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926874 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926881 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926891 4961 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926901 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926913 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926919 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926930 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926937 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926949 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926957 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926970 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926978 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="extract-content" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.926987 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.926994 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.927003 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" containerName="installer" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927010 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" containerName="installer" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.927022 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927030 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.927039 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927065 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.927076 4961 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927083 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: E0201 19:39:33.927093 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927100 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="extract-utilities" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927206 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="df83dd41-a7f5-4daf-8d82-43067543488a" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927224 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="6198133a-f3a2-4b73-b4f7-103c5da5e684" containerName="marketplace-operator" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927236 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bc77e5d-a2c6-48a6-acc9-dc952b52c4be" containerName="installer" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927247 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b71857-2f5b-42af-be87-47936a224fe0" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927255 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927264 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="966c0385-f333-4a81-972f-9e5569c11e11" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927276 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc19872-375c-4adc-88ca-bd7e84e57f63" containerName="registry-server" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.927721 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.930483 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.930614 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.932012 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.932110 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.942355 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.946264 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx8gj"] Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.950079 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"] Feb 01 19:39:33 crc kubenswrapper[4961]: I0201 19:39:33.950272 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerName="route-controller-manager" containerID="cri-o://6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f" gracePeriod=30 Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.036723 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280af251-125c-4b96-a327-239a67f4c474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.037022 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280af251-125c-4b96-a327-239a67f4c474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.037098 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v54dj\" (UniqueName: \"kubernetes.io/projected/280af251-125c-4b96-a327-239a67f4c474-kube-api-access-v54dj\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.139163 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v54dj\" (UniqueName: \"kubernetes.io/projected/280af251-125c-4b96-a327-239a67f4c474-kube-api-access-v54dj\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.139251 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280af251-125c-4b96-a327-239a67f4c474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.139292 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280af251-125c-4b96-a327-239a67f4c474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.141128 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/280af251-125c-4b96-a327-239a67f4c474-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.150029 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/280af251-125c-4b96-a327-239a67f4c474-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.157989 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v54dj\" (UniqueName: \"kubernetes.io/projected/280af251-125c-4b96-a327-239a67f4c474-kube-api-access-v54dj\") pod \"marketplace-operator-79b997595-gx8gj\" (UID: \"280af251-125c-4b96-a327-239a67f4c474\") " pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.258479 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.268495 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.284547 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.442184 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert\") pod \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.442630 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles\") pod \"613fa395-90dd-4c22-8326-57e4dcb5f79f\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.443360 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca" (OuterVolumeSpecName: "client-ca") pod "adda0b5e-49cb-4a97-a76c-0f19caddf65a" (UID: "adda0b5e-49cb-4a97-a76c-0f19caddf65a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.443890 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "613fa395-90dd-4c22-8326-57e4dcb5f79f" (UID: "613fa395-90dd-4c22-8326-57e4dcb5f79f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.442667 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca\") pod \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444012 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config\") pod \"613fa395-90dd-4c22-8326-57e4dcb5f79f\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444685 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config" (OuterVolumeSpecName: "config") pod "613fa395-90dd-4c22-8326-57e4dcb5f79f" (UID: "613fa395-90dd-4c22-8326-57e4dcb5f79f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444778 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gckss\" (UniqueName: \"kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss\") pod \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444821 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert\") pod \"613fa395-90dd-4c22-8326-57e4dcb5f79f\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444853 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca\") pod \"613fa395-90dd-4c22-8326-57e4dcb5f79f\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444904 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config\") pod \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\" (UID: \"adda0b5e-49cb-4a97-a76c-0f19caddf65a\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.444940 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrmgl\" (UniqueName: \"kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl\") pod \"613fa395-90dd-4c22-8326-57e4dcb5f79f\" (UID: \"613fa395-90dd-4c22-8326-57e4dcb5f79f\") " Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.445889 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca" (OuterVolumeSpecName: "client-ca") pod "613fa395-90dd-4c22-8326-57e4dcb5f79f" (UID: "613fa395-90dd-4c22-8326-57e4dcb5f79f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.445910 4961 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.445925 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.445934 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.447362 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config" (OuterVolumeSpecName: "config") pod "adda0b5e-49cb-4a97-a76c-0f19caddf65a" (UID: "adda0b5e-49cb-4a97-a76c-0f19caddf65a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.450315 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "adda0b5e-49cb-4a97-a76c-0f19caddf65a" (UID: "adda0b5e-49cb-4a97-a76c-0f19caddf65a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.450376 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss" (OuterVolumeSpecName: "kube-api-access-gckss") pod "adda0b5e-49cb-4a97-a76c-0f19caddf65a" (UID: "adda0b5e-49cb-4a97-a76c-0f19caddf65a"). InnerVolumeSpecName "kube-api-access-gckss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.450368 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "613fa395-90dd-4c22-8326-57e4dcb5f79f" (UID: "613fa395-90dd-4c22-8326-57e4dcb5f79f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.450429 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl" (OuterVolumeSpecName: "kube-api-access-rrmgl") pod "613fa395-90dd-4c22-8326-57e4dcb5f79f" (UID: "613fa395-90dd-4c22-8326-57e4dcb5f79f"). InnerVolumeSpecName "kube-api-access-rrmgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.500519 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gx8gj"] Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547008 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gckss\" (UniqueName: \"kubernetes.io/projected/adda0b5e-49cb-4a97-a76c-0f19caddf65a-kube-api-access-gckss\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547069 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/613fa395-90dd-4c22-8326-57e4dcb5f79f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547085 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/613fa395-90dd-4c22-8326-57e4dcb5f79f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547097 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adda0b5e-49cb-4a97-a76c-0f19caddf65a-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547108 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrmgl\" (UniqueName: \"kubernetes.io/projected/613fa395-90dd-4c22-8326-57e4dcb5f79f-kube-api-access-rrmgl\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.547121 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adda0b5e-49cb-4a97-a76c-0f19caddf65a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.618298 4961 generic.go:334] "Generic (PLEG): container finished" podID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerID="28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9" exitCode=0 Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.618366 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.618380 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" event={"ID":"613fa395-90dd-4c22-8326-57e4dcb5f79f","Type":"ContainerDied","Data":"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.618413 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-nbv69" event={"ID":"613fa395-90dd-4c22-8326-57e4dcb5f79f","Type":"ContainerDied","Data":"865a153eb7e03abe94d92983ca9c7a34434ad2f5264ea1d6ee10a8290be38442"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.618530 4961 scope.go:117] "RemoveContainer" containerID="28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.620009 4961 generic.go:334] "Generic (PLEG): container finished" podID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerID="6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f" exitCode=0 Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.620079 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.620157 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" event={"ID":"adda0b5e-49cb-4a97-a76c-0f19caddf65a","Type":"ContainerDied","Data":"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.620206 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp" event={"ID":"adda0b5e-49cb-4a97-a76c-0f19caddf65a","Type":"ContainerDied","Data":"222f5be60ca0378c8ba260383564f4215134a68a6eb6b83a3d406ac2b7edf1d9"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.621593 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" event={"ID":"280af251-125c-4b96-a327-239a67f4c474","Type":"ContainerStarted","Data":"e7a6586eee2dad1db1b80d5b072206b87b1f6f05f95c592e2c9b48db7a43fab0"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.621626 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" event={"ID":"280af251-125c-4b96-a327-239a67f4c474","Type":"ContainerStarted","Data":"179b835671a6727cbb43e2bb156ca37de39d1639f9b990dafb42291d72c1c3c9"} Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.621940 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.624491 4961 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gx8gj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.624532 4961 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" podUID="280af251-125c-4b96-a327-239a67f4c474" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.635018 4961 scope.go:117] "RemoveContainer" containerID="28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9" Feb 01 19:39:34 crc kubenswrapper[4961]: E0201 19:39:34.635649 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9\": container with ID starting with 28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9 not found: ID does not exist" containerID="28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.635818 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9"} err="failed to get container status \"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9\": rpc error: code = NotFound desc = could not find container \"28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9\": container with ID 
starting with 28ae1f7decb7facdc56991924b0e381ea56596a6db62e4c926f3d82bec1ec1e9 not found: ID does not exist" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.635937 4961 scope.go:117] "RemoveContainer" containerID="6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.663325 4961 scope.go:117] "RemoveContainer" containerID="6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f" Feb 01 19:39:34 crc kubenswrapper[4961]: E0201 19:39:34.664815 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f\": container with ID starting with 6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f not found: ID does not exist" containerID="6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.664869 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f"} err="failed to get container status \"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f\": rpc error: code = NotFound desc = could not find container \"6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f\": container with ID starting with 6e77e37f78e475fa430b1adbe78d877158133bfafe3f3286658623f94780466f not found: ID does not exist" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.682688 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" podStartSLOduration=1.682668932 podStartE2EDuration="1.682668932s" podCreationTimestamp="2026-02-01 19:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:39:34.645494777 +0000 UTC m=+329.170526210" watchObservedRunningTime="2026-02-01 19:39:34.682668932 +0000 UTC m=+329.207700355" Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.683564 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.686915 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-nbv69"] Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.689925 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"] Feb 01 19:39:34 crc kubenswrapper[4961]: I0201 19:39:34.692930 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-9qwhp"] Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.629386 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gx8gj" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.866821 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fd9994dbf-wgbx2"] Feb 01 19:39:35 crc kubenswrapper[4961]: E0201 19:39:35.867076 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerName="route-controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 
19:39:35.867090 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerName="route-controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: E0201 19:39:35.867100 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerName="controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.867107 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerName="controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.867209 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" containerName="route-controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.867222 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" containerName="controller-manager" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.867564 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.869547 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.870345 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.870975 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.872687 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.873008 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.873347 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.874223 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.874342 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.874549 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.876039 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.876921 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.877695 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.877999 4961 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.878272 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.887246 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd9994dbf-wgbx2"] Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.890457 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.895860 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963685 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-config\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963761 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963792 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxqw9\" (UniqueName: \"kubernetes.io/projected/1ce02d23-60c3-4015-9f92-10c40fe298b2-kube-api-access-fxqw9\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963810 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963828 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963851 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plff6\" (UniqueName: \"kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " 
pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963865 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-client-ca\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963880 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-proxy-ca-bundles\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:35 crc kubenswrapper[4961]: I0201 19:39:35.963894 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce02d23-60c3-4015-9f92-10c40fe298b2-serving-cert\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.064640 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-config\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065034 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065244 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqw9\" (UniqueName: \"kubernetes.io/projected/1ce02d23-60c3-4015-9f92-10c40fe298b2-kube-api-access-fxqw9\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065375 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065486 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: 
I0201 19:39:36.065625 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plff6\" (UniqueName: \"kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065740 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-client-ca\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065852 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-proxy-ca-bundles\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.065965 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce02d23-60c3-4015-9f92-10c40fe298b2-serving-cert\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.066499 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.066630 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.066713 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-client-ca\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.067187 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-proxy-ca-bundles\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.068289 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ce02d23-60c3-4015-9f92-10c40fe298b2-config\") pod 
\"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.072987 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ce02d23-60c3-4015-9f92-10c40fe298b2-serving-cert\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.073981 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.082953 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqw9\" (UniqueName: \"kubernetes.io/projected/1ce02d23-60c3-4015-9f92-10c40fe298b2-kube-api-access-fxqw9\") pod \"controller-manager-fd9994dbf-wgbx2\" (UID: \"1ce02d23-60c3-4015-9f92-10c40fe298b2\") " pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.083374 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plff6\" (UniqueName: \"kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6\") pod \"route-controller-manager-5495bb6774-k4qlt\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.190303 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.199545 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.246802 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613fa395-90dd-4c22-8326-57e4dcb5f79f" path="/var/lib/kubelet/pods/613fa395-90dd-4c22-8326-57e4dcb5f79f/volumes" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.248123 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adda0b5e-49cb-4a97-a76c-0f19caddf65a" path="/var/lib/kubelet/pods/adda0b5e-49cb-4a97-a76c-0f19caddf65a/volumes" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.414913 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fd9994dbf-wgbx2"] Feb 01 19:39:36 crc kubenswrapper[4961]: W0201 19:39:36.433789 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce02d23_60c3_4015_9f92_10c40fe298b2.slice/crio-9ee2adb81e9871aed56689a07c395cde614dac28a56db25f448e26557b38ff97 WatchSource:0}: Error finding container 9ee2adb81e9871aed56689a07c395cde614dac28a56db25f448e26557b38ff97: Status 404 returned error can't find the container with id 9ee2adb81e9871aed56689a07c395cde614dac28a56db25f448e26557b38ff97 Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.633758 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" event={"ID":"1ce02d23-60c3-4015-9f92-10c40fe298b2","Type":"ContainerStarted","Data":"2208c9dcc1c2ac5a281c379546ad0512732b65979050454ad72c2ffc4b9116fe"} Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.634028 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" event={"ID":"1ce02d23-60c3-4015-9f92-10c40fe298b2","Type":"ContainerStarted","Data":"9ee2adb81e9871aed56689a07c395cde614dac28a56db25f448e26557b38ff97"} Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.677687 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" podStartSLOduration=3.677664206 podStartE2EDuration="3.677664206s" podCreationTimestamp="2026-02-01 19:39:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:39:36.673088518 +0000 UTC m=+331.198119941" watchObservedRunningTime="2026-02-01 19:39:36.677664206 +0000 UTC m=+331.202695629" Feb 01 19:39:36 crc kubenswrapper[4961]: I0201 19:39:36.689545 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:36 crc kubenswrapper[4961]: W0201 19:39:36.694301 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dd66f8e_9dcd_4a48_b983_f57e72294ca0.slice/crio-e3757fbf3ad357a7e145981b2765f75f7c321dc67e50052abe4cf72fac6e0308 WatchSource:0}: Error finding container e3757fbf3ad357a7e145981b2765f75f7c321dc67e50052abe4cf72fac6e0308: Status 404 returned error can't find the container with id e3757fbf3ad357a7e145981b2765f75f7c321dc67e50052abe4cf72fac6e0308 Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.638712 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" 
event={"ID":"6dd66f8e-9dcd-4a48-b983-f57e72294ca0","Type":"ContainerStarted","Data":"e548e86bb21f8dcf8ee3542e71aea8e13b9e1b7b97b4cd287df8e1ffa4d33c3f"} Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.638813 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" event={"ID":"6dd66f8e-9dcd-4a48-b983-f57e72294ca0","Type":"ContainerStarted","Data":"e3757fbf3ad357a7e145981b2765f75f7c321dc67e50052abe4cf72fac6e0308"} Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.638832 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.639176 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.643159 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fd9994dbf-wgbx2" Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.643385 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:37 crc kubenswrapper[4961]: I0201 19:39:37.664688 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" podStartSLOduration=3.664668303 podStartE2EDuration="3.664668303s" podCreationTimestamp="2026-02-01 19:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:39:37.658643504 +0000 UTC m=+332.183674927" watchObservedRunningTime="2026-02-01 19:39:37.664668303 +0000 UTC m=+332.189699726" Feb 01 19:39:50 crc kubenswrapper[4961]: I0201 19:39:50.281857 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:39:50 crc kubenswrapper[4961]: I0201 19:39:50.282496 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.321147 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56xxs"] Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.322102 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.324062 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.335907 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56xxs"] Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.367764 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-catalog-content\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.367809 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqfl8\" (UniqueName: \"kubernetes.io/projected/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-kube-api-access-hqfl8\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.367836 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-utilities\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.469005 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-catalog-content\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.469115 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqfl8\" (UniqueName: \"kubernetes.io/projected/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-kube-api-access-hqfl8\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.469147 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-utilities\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.469574 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-catalog-content\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.469584 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-utilities\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " 
pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.493426 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqfl8\" (UniqueName: \"kubernetes.io/projected/f987a0d7-8d43-4ec5-a1dc-84d65534a69a-kube-api-access-hqfl8\") pod \"redhat-operators-56xxs\" (UID: \"f987a0d7-8d43-4ec5-a1dc-84d65534a69a\") " pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.524980 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llcxm"] Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.526274 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.530256 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.536784 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llcxm"] Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.570220 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-utilities\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.570273 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bmq\" (UniqueName: \"kubernetes.io/projected/94fdbb40-b51a-4305-8d95-a6b6a41f917d-kube-api-access-85bmq\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.570351 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-catalog-content\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.640288 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.670994 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-utilities\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.671077 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bmq\" (UniqueName: \"kubernetes.io/projected/94fdbb40-b51a-4305-8d95-a6b6a41f917d-kube-api-access-85bmq\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.671133 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-catalog-content\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.671591 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-catalog-content\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.671834 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94fdbb40-b51a-4305-8d95-a6b6a41f917d-utilities\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.701388 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bmq\" (UniqueName: \"kubernetes.io/projected/94fdbb40-b51a-4305-8d95-a6b6a41f917d-kube-api-access-85bmq\") pod \"community-operators-llcxm\" (UID: \"94fdbb40-b51a-4305-8d95-a6b6a41f917d\") " pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:53 crc kubenswrapper[4961]: I0201 19:39:53.860324 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.086777 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56xxs"] Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.328567 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llcxm"] Feb 01 19:39:54 crc kubenswrapper[4961]: W0201 19:39:54.335118 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94fdbb40_b51a_4305_8d95_a6b6a41f917d.slice/crio-ffbced9a7b1139890a8d6b3c0d5abda97ddeff739621c7f9e486befc91b6c4ed WatchSource:0}: Error finding container ffbced9a7b1139890a8d6b3c0d5abda97ddeff739621c7f9e486befc91b6c4ed: Status 404 returned error can't find the container with id ffbced9a7b1139890a8d6b3c0d5abda97ddeff739621c7f9e486befc91b6c4ed Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.432822 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.433333 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" podUID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" containerName="route-controller-manager" containerID="cri-o://e548e86bb21f8dcf8ee3542e71aea8e13b9e1b7b97b4cd287df8e1ffa4d33c3f" gracePeriod=30 Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.728989 4961 generic.go:334] "Generic (PLEG): container finished" podID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" containerID="e548e86bb21f8dcf8ee3542e71aea8e13b9e1b7b97b4cd287df8e1ffa4d33c3f" exitCode=0 Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.729117 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" event={"ID":"6dd66f8e-9dcd-4a48-b983-f57e72294ca0","Type":"ContainerDied","Data":"e548e86bb21f8dcf8ee3542e71aea8e13b9e1b7b97b4cd287df8e1ffa4d33c3f"} Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.731170 4961 generic.go:334] "Generic (PLEG): container finished" podID="94fdbb40-b51a-4305-8d95-a6b6a41f917d" containerID="924d17acb0aed4cdccf48f9f785027c0744a5215283022779d9dd860327b652b" exitCode=0 Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.731245 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llcxm" event={"ID":"94fdbb40-b51a-4305-8d95-a6b6a41f917d","Type":"ContainerDied","Data":"924d17acb0aed4cdccf48f9f785027c0744a5215283022779d9dd860327b652b"} Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.731276 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llcxm" event={"ID":"94fdbb40-b51a-4305-8d95-a6b6a41f917d","Type":"ContainerStarted","Data":"ffbced9a7b1139890a8d6b3c0d5abda97ddeff739621c7f9e486befc91b6c4ed"} Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.733628 4961 generic.go:334] "Generic (PLEG): container finished" podID="f987a0d7-8d43-4ec5-a1dc-84d65534a69a" containerID="789b5d21bc9c389462d39b5848e5e16b1619efac5016be4404573f528c293c1a" exitCode=0 Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.733734 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56xxs" 
event={"ID":"f987a0d7-8d43-4ec5-a1dc-84d65534a69a","Type":"ContainerDied","Data":"789b5d21bc9c389462d39b5848e5e16b1619efac5016be4404573f528c293c1a"} Feb 01 19:39:54 crc kubenswrapper[4961]: I0201 19:39:54.734435 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56xxs" event={"ID":"f987a0d7-8d43-4ec5-a1dc-84d65534a69a","Type":"ContainerStarted","Data":"3d84fa211861e07674846e5cef653dd56f8b444abc9667ca54baac489871a06a"} Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.457835 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.487201 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr"] Feb 01 19:39:55 crc kubenswrapper[4961]: E0201 19:39:55.487428 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" containerName="route-controller-manager" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.487445 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" containerName="route-controller-manager" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.487562 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" containerName="route-controller-manager" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.488004 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.501773 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.595147 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca\") pod \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.595245 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config\") pod \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.595302 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plff6\" (UniqueName: \"kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6\") pod \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.595404 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert\") pod \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\" (UID: \"6dd66f8e-9dcd-4a48-b983-f57e72294ca0\") " Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596096 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6fe1989e-7caf-43ba-b69b-e192bddff8f5-serving-cert\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596169 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpzn\" (UniqueName: \"kubernetes.io/projected/6fe1989e-7caf-43ba-b69b-e192bddff8f5-kube-api-access-gfpzn\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596227 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-client-ca\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596288 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-config\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596386 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca" (OuterVolumeSpecName: "client-ca") pod "6dd66f8e-9dcd-4a48-b983-f57e72294ca0" (UID: "6dd66f8e-9dcd-4a48-b983-f57e72294ca0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.596401 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config" (OuterVolumeSpecName: "config") pod "6dd66f8e-9dcd-4a48-b983-f57e72294ca0" (UID: "6dd66f8e-9dcd-4a48-b983-f57e72294ca0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.602259 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6dd66f8e-9dcd-4a48-b983-f57e72294ca0" (UID: "6dd66f8e-9dcd-4a48-b983-f57e72294ca0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.611240 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6" (OuterVolumeSpecName: "kube-api-access-plff6") pod "6dd66f8e-9dcd-4a48-b983-f57e72294ca0" (UID: "6dd66f8e-9dcd-4a48-b983-f57e72294ca0"). InnerVolumeSpecName "kube-api-access-plff6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697707 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpzn\" (UniqueName: \"kubernetes.io/projected/6fe1989e-7caf-43ba-b69b-e192bddff8f5-kube-api-access-gfpzn\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697791 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-client-ca\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697839 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-config\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697869 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe1989e-7caf-43ba-b69b-e192bddff8f5-serving-cert\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697919 4961 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697932 4961 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697941 4961 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.697951 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plff6\" (UniqueName: \"kubernetes.io/projected/6dd66f8e-9dcd-4a48-b983-f57e72294ca0-kube-api-access-plff6\") on node \"crc\" DevicePath \"\"" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.698941 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-client-ca\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.699201 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe1989e-7caf-43ba-b69b-e192bddff8f5-config\") pod \"route-controller-manager-56855464d7-qwqpr\" 
(UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.703030 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fe1989e-7caf-43ba-b69b-e192bddff8f5-serving-cert\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.722298 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpzn\" (UniqueName: \"kubernetes.io/projected/6fe1989e-7caf-43ba-b69b-e192bddff8f5-kube-api-access-gfpzn\") pod \"route-controller-manager-56855464d7-qwqpr\" (UID: \"6fe1989e-7caf-43ba-b69b-e192bddff8f5\") " pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.725859 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.727012 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.728967 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.738715 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.752465 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llcxm" event={"ID":"94fdbb40-b51a-4305-8d95-a6b6a41f917d","Type":"ContainerStarted","Data":"9f86dd36b67166afddb078abaea5137145cb1c50d8ea1858e316e27a9ffa8949"} Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.755248 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56xxs" event={"ID":"f987a0d7-8d43-4ec5-a1dc-84d65534a69a","Type":"ContainerStarted","Data":"92b55334b484c9a773b3c2249b664e29d015ee758d917d2ad5098f41b61277f4"} Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.757716 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" event={"ID":"6dd66f8e-9dcd-4a48-b983-f57e72294ca0","Type":"ContainerDied","Data":"e3757fbf3ad357a7e145981b2765f75f7c321dc67e50052abe4cf72fac6e0308"} Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.757766 4961 scope.go:117] "RemoveContainer" containerID="e548e86bb21f8dcf8ee3542e71aea8e13b9e1b7b97b4cd287df8e1ffa4d33c3f" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.757793 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.799004 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.799112 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.799154 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4rz\" (UniqueName: \"kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.810421 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.835457 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.843623 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5495bb6774-k4qlt"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.900246 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4rz\" (UniqueName: \"kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.900376 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.900409 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.900897 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 
01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.901133 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.921807 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h2lhr"] Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.923319 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4rz\" (UniqueName: \"kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz\") pod \"certified-operators-8xfhh\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.928024 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.932611 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 01 19:39:55 crc kubenswrapper[4961]: I0201 19:39:55.938175 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2lhr"] Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.001872 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghjx\" (UniqueName: \"kubernetes.io/projected/7ea3c801-5aaf-41f2-9823-31465d862851-kube-api-access-jghjx\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.001955 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-catalog-content\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.002012 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-utilities\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.075296 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.103001 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-utilities\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.103082 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghjx\" (UniqueName: \"kubernetes.io/projected/7ea3c801-5aaf-41f2-9823-31465d862851-kube-api-access-jghjx\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.103138 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-catalog-content\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.103696 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-utilities\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.104889 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ea3c801-5aaf-41f2-9823-31465d862851-catalog-content\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.118614 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghjx\" (UniqueName: \"kubernetes.io/projected/7ea3c801-5aaf-41f2-9823-31465d862851-kube-api-access-jghjx\") pod \"redhat-marketplace-h2lhr\" (UID: \"7ea3c801-5aaf-41f2-9823-31465d862851\") " pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.211158 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr"] Feb 01 19:39:56 crc kubenswrapper[4961]: W0201 19:39:56.212947 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe1989e_7caf_43ba_b69b_e192bddff8f5.slice/crio-8e294ab2ebd1b721124e7243eeb23aaf20d90c9fcbf0e782f53a192b2383f2d0 WatchSource:0}: Error finding container 8e294ab2ebd1b721124e7243eeb23aaf20d90c9fcbf0e782f53a192b2383f2d0: Status 404 returned error can't find the container with id 8e294ab2ebd1b721124e7243eeb23aaf20d90c9fcbf0e782f53a192b2383f2d0 Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.246583 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.247118 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dd66f8e-9dcd-4a48-b983-f57e72294ca0" path="/var/lib/kubelet/pods/6dd66f8e-9dcd-4a48-b983-f57e72294ca0/volumes" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.475645 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:39:56 crc kubenswrapper[4961]: W0201 19:39:56.480406 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18b1c2a_910a_4e83_bcd3_6b5cc87b6d52.slice/crio-3446b1be5d339b0a53b9106ba8be8d3c70641deae02b2db5c4e4b02487adb5da WatchSource:0}: Error finding container 3446b1be5d339b0a53b9106ba8be8d3c70641deae02b2db5c4e4b02487adb5da: Status 404 returned error can't find the container with id 3446b1be5d339b0a53b9106ba8be8d3c70641deae02b2db5c4e4b02487adb5da Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.658498 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h2lhr"] Feb 01 19:39:56 crc kubenswrapper[4961]: W0201 19:39:56.709691 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea3c801_5aaf_41f2_9823_31465d862851.slice/crio-e7fc1649c3a79d7ab7a07a6191e41cd46c3f4565979cfa7ea7a78d17fd6db572 WatchSource:0}: Error finding container e7fc1649c3a79d7ab7a07a6191e41cd46c3f4565979cfa7ea7a78d17fd6db572: Status 404 returned error can't find the container with id e7fc1649c3a79d7ab7a07a6191e41cd46c3f4565979cfa7ea7a78d17fd6db572 Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.766648 4961 generic.go:334] "Generic (PLEG): container finished" podID="f987a0d7-8d43-4ec5-a1dc-84d65534a69a" containerID="92b55334b484c9a773b3c2249b664e29d015ee758d917d2ad5098f41b61277f4" exitCode=0 Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.766751 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56xxs" event={"ID":"f987a0d7-8d43-4ec5-a1dc-84d65534a69a","Type":"ContainerDied","Data":"92b55334b484c9a773b3c2249b664e29d015ee758d917d2ad5098f41b61277f4"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.770586 4961 generic.go:334] "Generic (PLEG): container finished" podID="94fdbb40-b51a-4305-8d95-a6b6a41f917d" containerID="9f86dd36b67166afddb078abaea5137145cb1c50d8ea1858e316e27a9ffa8949" exitCode=0 Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.770667 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llcxm" event={"ID":"94fdbb40-b51a-4305-8d95-a6b6a41f917d","Type":"ContainerDied","Data":"9f86dd36b67166afddb078abaea5137145cb1c50d8ea1858e316e27a9ffa8949"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.774239 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" event={"ID":"6fe1989e-7caf-43ba-b69b-e192bddff8f5","Type":"ContainerStarted","Data":"322bd29f782dd6f85f8490a9aa1e6bd28a9495cfc100ebb5d168106ab292dcda"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.774299 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" 
event={"ID":"6fe1989e-7caf-43ba-b69b-e192bddff8f5","Type":"ContainerStarted","Data":"8e294ab2ebd1b721124e7243eeb23aaf20d90c9fcbf0e782f53a192b2383f2d0"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.774427 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.782013 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerDied","Data":"5c5eb6666c02f8dd488105c4e2eab6865657b419e4520ada5d73b1bcb53372b3"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.782363 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.782515 4961 generic.go:334] "Generic (PLEG): container finished" podID="e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" containerID="5c5eb6666c02f8dd488105c4e2eab6865657b419e4520ada5d73b1bcb53372b3" exitCode=0 Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.782625 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerStarted","Data":"3446b1be5d339b0a53b9106ba8be8d3c70641deae02b2db5c4e4b02487adb5da"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.784707 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2lhr" event={"ID":"7ea3c801-5aaf-41f2-9823-31465d862851","Type":"ContainerStarted","Data":"e7fc1649c3a79d7ab7a07a6191e41cd46c3f4565979cfa7ea7a78d17fd6db572"} Feb 01 19:39:56 crc kubenswrapper[4961]: I0201 19:39:56.819785 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56855464d7-qwqpr" podStartSLOduration=2.819769216 podStartE2EDuration="2.819769216s" podCreationTimestamp="2026-02-01 19:39:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:39:56.81809716 +0000 UTC m=+351.343128583" watchObservedRunningTime="2026-02-01 19:39:56.819769216 +0000 UTC m=+351.344800639" Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.791791 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerStarted","Data":"bb478315b62e8809827141bd6800aae54ac451e96320a864745034329c79903a"} Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.793926 4961 generic.go:334] "Generic (PLEG): container finished" podID="7ea3c801-5aaf-41f2-9823-31465d862851" containerID="a0f237f31e451dacf9b61a5e285db0f2ca7d3466faf2b3a6ce1c8b09710e03b2" exitCode=0 Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.793985 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2lhr" event={"ID":"7ea3c801-5aaf-41f2-9823-31465d862851","Type":"ContainerDied","Data":"a0f237f31e451dacf9b61a5e285db0f2ca7d3466faf2b3a6ce1c8b09710e03b2"} Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.796728 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llcxm" 
event={"ID":"94fdbb40-b51a-4305-8d95-a6b6a41f917d","Type":"ContainerStarted","Data":"4df65a12e71cb2c8d2a16f38f116b1b52ab19432b6b6fc2cc7414b1aad3deaa9"} Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.800483 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56xxs" event={"ID":"f987a0d7-8d43-4ec5-a1dc-84d65534a69a","Type":"ContainerStarted","Data":"18b284408ed01a1af4748c223095d7dea1c7335a0318ff27db4758ca0797eb21"} Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.834580 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llcxm" podStartSLOduration=2.356269736 podStartE2EDuration="4.834561065s" podCreationTimestamp="2026-02-01 19:39:53 +0000 UTC" firstStartedPulling="2026-02-01 19:39:54.732933462 +0000 UTC m=+349.257964885" lastFinishedPulling="2026-02-01 19:39:57.211224791 +0000 UTC m=+351.736256214" observedRunningTime="2026-02-01 19:39:57.826248291 +0000 UTC m=+352.351279714" watchObservedRunningTime="2026-02-01 19:39:57.834561065 +0000 UTC m=+352.359592488" Feb 01 19:39:57 crc kubenswrapper[4961]: I0201 19:39:57.867663 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56xxs" podStartSLOduration=2.351378106 podStartE2EDuration="4.867642744s" podCreationTimestamp="2026-02-01 19:39:53 +0000 UTC" firstStartedPulling="2026-02-01 19:39:54.735667998 +0000 UTC m=+349.260699431" lastFinishedPulling="2026-02-01 19:39:57.251932646 +0000 UTC m=+351.776964069" observedRunningTime="2026-02-01 19:39:57.862970614 +0000 UTC m=+352.388002047" watchObservedRunningTime="2026-02-01 19:39:57.867642744 +0000 UTC m=+352.392674167" Feb 01 19:39:58 crc kubenswrapper[4961]: I0201 19:39:58.806190 4961 generic.go:334] "Generic (PLEG): container finished" podID="e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" containerID="bb478315b62e8809827141bd6800aae54ac451e96320a864745034329c79903a" exitCode=0 Feb 01 19:39:58 crc kubenswrapper[4961]: I0201 19:39:58.806496 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerDied","Data":"bb478315b62e8809827141bd6800aae54ac451e96320a864745034329c79903a"} Feb 01 19:39:58 crc kubenswrapper[4961]: I0201 19:39:58.808801 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2lhr" event={"ID":"7ea3c801-5aaf-41f2-9823-31465d862851","Type":"ContainerStarted","Data":"4743852325fadbd159b8f6092ff05643b5cdc6783a704f9fbee1e433e83bb2dc"} Feb 01 19:39:59 crc kubenswrapper[4961]: I0201 19:39:59.816663 4961 generic.go:334] "Generic (PLEG): container finished" podID="7ea3c801-5aaf-41f2-9823-31465d862851" containerID="4743852325fadbd159b8f6092ff05643b5cdc6783a704f9fbee1e433e83bb2dc" exitCode=0 Feb 01 19:39:59 crc kubenswrapper[4961]: I0201 19:39:59.816761 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2lhr" event={"ID":"7ea3c801-5aaf-41f2-9823-31465d862851","Type":"ContainerDied","Data":"4743852325fadbd159b8f6092ff05643b5cdc6783a704f9fbee1e433e83bb2dc"} Feb 01 19:39:59 crc kubenswrapper[4961]: I0201 19:39:59.819666 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerStarted","Data":"a20ccecafb6394299d55482bcdd64a73c9b9e7f82d4c3d27707c9b11371b7e4d"} Feb 01 19:39:59 crc 
kubenswrapper[4961]: I0201 19:39:59.856180 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8xfhh" podStartSLOduration=2.4556912029999998 podStartE2EDuration="4.856162987s" podCreationTimestamp="2026-02-01 19:39:55 +0000 UTC" firstStartedPulling="2026-02-01 19:39:56.792727776 +0000 UTC m=+351.317759199" lastFinishedPulling="2026-02-01 19:39:59.19319956 +0000 UTC m=+353.718230983" observedRunningTime="2026-02-01 19:39:59.851430084 +0000 UTC m=+354.376461527" watchObservedRunningTime="2026-02-01 19:39:59.856162987 +0000 UTC m=+354.381194410" Feb 01 19:40:00 crc kubenswrapper[4961]: I0201 19:40:00.826387 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h2lhr" event={"ID":"7ea3c801-5aaf-41f2-9823-31465d862851","Type":"ContainerStarted","Data":"2ab48376a2a537bce89fcafa345c9b682e01bebb14aa2d78f7fa80cee1d3b37b"} Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.641380 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.641917 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.683470 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.709395 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h2lhr" podStartSLOduration=6.238986892 podStartE2EDuration="8.70937906s" podCreationTimestamp="2026-02-01 19:39:55 +0000 UTC" firstStartedPulling="2026-02-01 19:39:57.79525784 +0000 UTC m=+352.320289263" lastFinishedPulling="2026-02-01 19:40:00.265650008 +0000 UTC m=+354.790681431" observedRunningTime="2026-02-01 19:40:00.849489032 +0000 UTC m=+355.374520455" watchObservedRunningTime="2026-02-01 19:40:03.70937906 +0000 UTC m=+358.234410483" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.861076 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.861405 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.888448 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56xxs" Feb 01 19:40:03 crc kubenswrapper[4961]: I0201 19:40:03.902550 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:40:04 crc kubenswrapper[4961]: I0201 19:40:04.880761 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llcxm" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.075944 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.076262 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.114727 4961 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.247099 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.247225 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.293921 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.893539 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h2lhr" Feb 01 19:40:06 crc kubenswrapper[4961]: I0201 19:40:06.906935 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:40:19 crc kubenswrapper[4961]: I0201 19:40:19.966949 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w56td"] Feb 01 19:40:19 crc kubenswrapper[4961]: I0201 19:40:19.968101 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.026072 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w56td"] Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.084956 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6vpq\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-kube-api-access-h6vpq\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085005 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085055 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-tls\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085098 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085141 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085166 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085256 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-trusted-ca\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.085301 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-certificates\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.122027 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186004 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186090 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186119 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-trusted-ca\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186142 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-certificates\") pod \"image-registry-66df7c8f76-w56td\" (UID: 
\"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186209 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6vpq\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-kube-api-access-h6vpq\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186239 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186270 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-tls\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.186684 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.189373 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-certificates\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.189479 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-trusted-ca\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.192825 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.200922 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-registry-tls\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.202815 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-bound-sa-token\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.202972 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6vpq\" (UniqueName: \"kubernetes.io/projected/da1d1b46-ec0f-4b67-8fb1-8902ef61a1db-kube-api-access-h6vpq\") pod \"image-registry-66df7c8f76-w56td\" (UID: \"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db\") " pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.281656 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.281715 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.282864 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.706022 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w56td"] Feb 01 19:40:20 crc kubenswrapper[4961]: W0201 19:40:20.712293 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda1d1b46_ec0f_4b67_8fb1_8902ef61a1db.slice/crio-652a54c0cd978aab67d62ea07e7a9a5f2db95c74823fbea559e5c01091c5a3d2 WatchSource:0}: Error finding container 652a54c0cd978aab67d62ea07e7a9a5f2db95c74823fbea559e5c01091c5a3d2: Status 404 returned error can't find the container with id 652a54c0cd978aab67d62ea07e7a9a5f2db95c74823fbea559e5c01091c5a3d2 Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.928960 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" event={"ID":"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db","Type":"ContainerStarted","Data":"a3001f6e008d6a7ea3d7c68afa0f785be3d7b4d01ec6208dccfeb90e6d8525e0"} Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.929012 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" event={"ID":"da1d1b46-ec0f-4b67-8fb1-8902ef61a1db","Type":"ContainerStarted","Data":"652a54c0cd978aab67d62ea07e7a9a5f2db95c74823fbea559e5c01091c5a3d2"} Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.929141 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:20 crc kubenswrapper[4961]: I0201 19:40:20.948615 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" podStartSLOduration=1.948598966 podStartE2EDuration="1.948598966s" podCreationTimestamp="2026-02-01 19:40:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:40:20.946734493 +0000 UTC m=+375.471765926" watchObservedRunningTime="2026-02-01 19:40:20.948598966 +0000 UTC m=+375.473630389" Feb 01 19:40:40 crc kubenswrapper[4961]: I0201 19:40:40.294880 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w56td" Feb 01 19:40:40 crc kubenswrapper[4961]: I0201 19:40:40.364564 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:40:50 crc kubenswrapper[4961]: I0201 19:40:50.281933 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:40:50 crc kubenswrapper[4961]: I0201 19:40:50.282690 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:40:50 crc kubenswrapper[4961]: I0201 19:40:50.282751 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:40:50 crc kubenswrapper[4961]: I0201 19:40:50.283639 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31"} pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 19:40:50 crc kubenswrapper[4961]: I0201 19:40:50.283745 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" containerID="cri-o://4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31" gracePeriod=600 Feb 01 19:40:51 crc kubenswrapper[4961]: I0201 19:40:51.131969 4961 generic.go:334] "Generic (PLEG): container finished" podID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerID="4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31" exitCode=0 Feb 01 19:40:51 crc kubenswrapper[4961]: I0201 19:40:51.132085 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerDied","Data":"4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31"} Feb 01 19:40:51 crc kubenswrapper[4961]: I0201 19:40:51.132235 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707"} Feb 01 19:40:51 crc kubenswrapper[4961]: I0201 19:40:51.132266 4961 scope.go:117] "RemoveContainer" containerID="f52d8aba828d3a5a093d42fe9a004ec233ce7876d721f45902f62cfc7c958361" Feb 01 19:41:05 
crc kubenswrapper[4961]: I0201 19:41:05.404622 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" podUID="5723f9af-aa2e-4113-9e47-90f01ea33631" containerName="registry" containerID="cri-o://c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539" gracePeriod=30 Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.815100 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.960945 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961104 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm9zj\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961149 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961593 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961672 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961701 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961788 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.961858 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls\") pod \"5723f9af-aa2e-4113-9e47-90f01ea33631\" (UID: \"5723f9af-aa2e-4113-9e47-90f01ea33631\") " Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.963274 4961 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.963491 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.971545 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj" (OuterVolumeSpecName: "kube-api-access-pm9zj") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "kube-api-access-pm9zj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.971650 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.973777 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.973996 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.975580 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 01 19:41:05 crc kubenswrapper[4961]: I0201 19:41:05.990790 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5723f9af-aa2e-4113-9e47-90f01ea33631" (UID: "5723f9af-aa2e-4113-9e47-90f01ea33631"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063399 4961 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063437 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm9zj\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-kube-api-access-pm9zj\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063450 4961 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5723f9af-aa2e-4113-9e47-90f01ea33631-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063459 4961 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063467 4961 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5723f9af-aa2e-4113-9e47-90f01ea33631-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063474 4961 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5723f9af-aa2e-4113-9e47-90f01ea33631-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.063482 4961 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5723f9af-aa2e-4113-9e47-90f01ea33631-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.235339 4961 generic.go:334] "Generic (PLEG): container finished" podID="5723f9af-aa2e-4113-9e47-90f01ea33631" containerID="c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539" exitCode=0 Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.235407 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.242031 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" event={"ID":"5723f9af-aa2e-4113-9e47-90f01ea33631","Type":"ContainerDied","Data":"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539"} Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.242124 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnz77" event={"ID":"5723f9af-aa2e-4113-9e47-90f01ea33631","Type":"ContainerDied","Data":"7c85fe4d697e3451ebd2586e24f5b0baaff324ee8b5b3d137e9aa42417819323"} Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.242156 4961 scope.go:117] "RemoveContainer" containerID="c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.264501 4961 scope.go:117] "RemoveContainer" containerID="c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539" Feb 01 19:41:06 crc kubenswrapper[4961]: E0201 19:41:06.265070 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539\": container with ID starting with c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539 not found: ID does not exist" containerID="c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.265129 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539"} err="failed to get container status \"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539\": rpc error: code = NotFound desc = could not find container \"c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539\": container with ID starting with c68b2cb714905afe1cd5107fc17ac4122e6217c40506bc64ae57b4be3a842539 not found: ID does not exist" Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.290159 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:41:06 crc kubenswrapper[4961]: I0201 19:41:06.297115 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnz77"] Feb 01 19:41:08 crc kubenswrapper[4961]: I0201 19:41:08.247535 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5723f9af-aa2e-4113-9e47-90f01ea33631" path="/var/lib/kubelet/pods/5723f9af-aa2e-4113-9e47-90f01ea33631/volumes" Feb 01 19:42:50 crc kubenswrapper[4961]: I0201 19:42:50.286369 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:42:50 crc kubenswrapper[4961]: I0201 19:42:50.287781 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:43:06 crc 
kubenswrapper[4961]: I0201 19:43:06.406448 4961 scope.go:117] "RemoveContainer" containerID="c07a406837f821ca868f967385f43d2f94d998c8f2582ed9c981759f9e0fa52a" Feb 01 19:43:20 crc kubenswrapper[4961]: I0201 19:43:20.281674 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:43:20 crc kubenswrapper[4961]: I0201 19:43:20.282484 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.827276 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-grqlz"] Feb 01 19:43:25 crc kubenswrapper[4961]: E0201 19:43:25.828161 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5723f9af-aa2e-4113-9e47-90f01ea33631" containerName="registry" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.828178 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="5723f9af-aa2e-4113-9e47-90f01ea33631" containerName="registry" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.828297 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="5723f9af-aa2e-4113-9e47-90f01ea33631" containerName="registry" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.828739 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.831241 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.831493 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.831679 4961 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-hrjk7" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.838435 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-4hjhf"] Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.839338 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4hjhf" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.844972 4961 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-b7rnp" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.848890 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-grqlz"] Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.853275 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2h4t6"] Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.854017 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.855752 4961 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-hbbfx" Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.858019 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4hjhf"] Feb 01 19:43:25 crc kubenswrapper[4961]: I0201 19:43:25.913177 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2h4t6"] Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.020613 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknww\" (UniqueName: \"kubernetes.io/projected/0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0-kube-api-access-pknww\") pod \"cert-manager-cainjector-cf98fcc89-grqlz\" (UID: \"0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.020663 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd54r\" (UniqueName: \"kubernetes.io/projected/f8ee4e6a-e069-4269-bff1-bd1c5df23795-kube-api-access-hd54r\") pod \"cert-manager-858654f9db-4hjhf\" (UID: \"f8ee4e6a-e069-4269-bff1-bd1c5df23795\") " pod="cert-manager/cert-manager-858654f9db-4hjhf" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.020704 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjw2z\" (UniqueName: \"kubernetes.io/projected/c026d1b9-4f13-4322-86de-e329b5e64b26-kube-api-access-gjw2z\") pod \"cert-manager-webhook-687f57d79b-2h4t6\" (UID: \"c026d1b9-4f13-4322-86de-e329b5e64b26\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.121772 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknww\" (UniqueName: \"kubernetes.io/projected/0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0-kube-api-access-pknww\") pod \"cert-manager-cainjector-cf98fcc89-grqlz\" (UID: \"0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.122202 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd54r\" (UniqueName: \"kubernetes.io/projected/f8ee4e6a-e069-4269-bff1-bd1c5df23795-kube-api-access-hd54r\") pod \"cert-manager-858654f9db-4hjhf\" (UID: \"f8ee4e6a-e069-4269-bff1-bd1c5df23795\") " pod="cert-manager/cert-manager-858654f9db-4hjhf" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.122476 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjw2z\" (UniqueName: \"kubernetes.io/projected/c026d1b9-4f13-4322-86de-e329b5e64b26-kube-api-access-gjw2z\") pod \"cert-manager-webhook-687f57d79b-2h4t6\" (UID: \"c026d1b9-4f13-4322-86de-e329b5e64b26\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.142171 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknww\" (UniqueName: \"kubernetes.io/projected/0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0-kube-api-access-pknww\") pod \"cert-manager-cainjector-cf98fcc89-grqlz\" (UID: \"0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.144417 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjw2z\" (UniqueName: \"kubernetes.io/projected/c026d1b9-4f13-4322-86de-e329b5e64b26-kube-api-access-gjw2z\") pod \"cert-manager-webhook-687f57d79b-2h4t6\" (UID: \"c026d1b9-4f13-4322-86de-e329b5e64b26\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.145411 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd54r\" (UniqueName: \"kubernetes.io/projected/f8ee4e6a-e069-4269-bff1-bd1c5df23795-kube-api-access-hd54r\") pod \"cert-manager-858654f9db-4hjhf\" (UID: \"f8ee4e6a-e069-4269-bff1-bd1c5df23795\") " pod="cert-manager/cert-manager-858654f9db-4hjhf" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.156699 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.168032 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-4hjhf" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.183365 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.416318 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-4hjhf"] Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.432380 4961 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.669331 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-grqlz"] Feb 01 19:43:26 crc kubenswrapper[4961]: I0201 19:43:26.672315 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2h4t6"] Feb 01 19:43:26 crc kubenswrapper[4961]: W0201 19:43:26.680330 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0459a3cb_30ac_42fa_8de3_96d9a3d6dbe0.slice/crio-0e2abc6b4dd262942102b09286add95c63692d8b160fe5ad1ec394e9b25a7dd9 WatchSource:0}: Error finding container 0e2abc6b4dd262942102b09286add95c63692d8b160fe5ad1ec394e9b25a7dd9: Status 404 returned error can't find the container with id 0e2abc6b4dd262942102b09286add95c63692d8b160fe5ad1ec394e9b25a7dd9 Feb 01 19:43:27 crc kubenswrapper[4961]: I0201 19:43:27.145948 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" event={"ID":"c026d1b9-4f13-4322-86de-e329b5e64b26","Type":"ContainerStarted","Data":"7faa29f5e8200ca50638d5d5e4bcf48a867e61053c0806076255661a8a92d5eb"} Feb 01 19:43:27 crc kubenswrapper[4961]: I0201 19:43:27.147433 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" event={"ID":"0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0","Type":"ContainerStarted","Data":"0e2abc6b4dd262942102b09286add95c63692d8b160fe5ad1ec394e9b25a7dd9"} Feb 01 19:43:27 crc kubenswrapper[4961]: I0201 19:43:27.148906 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4hjhf" 
event={"ID":"f8ee4e6a-e069-4269-bff1-bd1c5df23795","Type":"ContainerStarted","Data":"e32fef43caa32fb1ce425cb268b85e6658d7493e585ed72636173b32d0de26c3"} Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.170941 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-4hjhf" event={"ID":"f8ee4e6a-e069-4269-bff1-bd1c5df23795","Type":"ContainerStarted","Data":"b0c0b406fc57ea06f52e12371b144b66416fe33153b401bfdc2103485dd48517"} Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.172762 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" event={"ID":"c026d1b9-4f13-4322-86de-e329b5e64b26","Type":"ContainerStarted","Data":"ee6dcd21b882a4da03faeb552d5e93f5d19974dd9fd7fe73c79d63961501e3cd"} Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.172855 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.174262 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" event={"ID":"0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0","Type":"ContainerStarted","Data":"d944e04c1aedce6e82d5589fbb20cd902b2ba63c013b4386602a1f6122d8031a"} Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.192728 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-4hjhf" podStartSLOduration=2.204328556 podStartE2EDuration="6.192701834s" podCreationTimestamp="2026-02-01 19:43:25 +0000 UTC" firstStartedPulling="2026-02-01 19:43:26.432191446 +0000 UTC m=+560.957222859" lastFinishedPulling="2026-02-01 19:43:30.420564714 +0000 UTC m=+564.945596137" observedRunningTime="2026-02-01 19:43:31.184433508 +0000 UTC m=+565.709464931" watchObservedRunningTime="2026-02-01 19:43:31.192701834 +0000 UTC m=+565.717733277" Feb 01 19:43:31 crc kubenswrapper[4961]: I0201 19:43:31.246440 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-grqlz" podStartSLOduration=2.369783233 podStartE2EDuration="6.246419671s" podCreationTimestamp="2026-02-01 19:43:25 +0000 UTC" firstStartedPulling="2026-02-01 19:43:26.684968393 +0000 UTC m=+561.209999816" lastFinishedPulling="2026-02-01 19:43:30.561604821 +0000 UTC m=+565.086636254" observedRunningTime="2026-02-01 19:43:31.244822455 +0000 UTC m=+565.769853888" watchObservedRunningTime="2026-02-01 19:43:31.246419671 +0000 UTC m=+565.771451104" Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.924860 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" podStartSLOduration=7.100256516 podStartE2EDuration="10.924829585s" podCreationTimestamp="2026-02-01 19:43:25 +0000 UTC" firstStartedPulling="2026-02-01 19:43:26.680778559 +0000 UTC m=+561.205809982" lastFinishedPulling="2026-02-01 19:43:30.505351628 +0000 UTC m=+565.030383051" observedRunningTime="2026-02-01 19:43:31.265813557 +0000 UTC m=+565.790844970" watchObservedRunningTime="2026-02-01 19:43:35.924829585 +0000 UTC m=+570.449861028" Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.926895 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ng92q"] Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927489 4961 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-controller" containerID="cri-o://3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927584 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="nbdb" containerID="cri-o://3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927614 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="sbdb" containerID="cri-o://c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927661 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-node" containerID="cri-o://4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927698 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-acl-logging" containerID="cri-o://1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.927728 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.931741 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="northd" containerID="cri-o://6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea" gracePeriod=30 Feb 01 19:43:35 crc kubenswrapper[4961]: I0201 19:43:35.973280 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" containerID="cri-o://82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4" gracePeriod=30 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.186410 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-2h4t6" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.205257 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovnkube-controller/3.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.208388 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-acl-logging/0.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.208906 4961 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-controller/0.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209280 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4" exitCode=0 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209310 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3" exitCode=0 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209320 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65" exitCode=0 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209330 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c" exitCode=0 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209340 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054" exitCode=0 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209349 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311" exitCode=143 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209358 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9" exitCode=143 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209402 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209435 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209448 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209459 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209470 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054"} Feb 01 19:43:36 crc kubenswrapper[4961]: 
I0201 19:43:36.209481 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209492 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.209511 4961 scope.go:117] "RemoveContainer" containerID="cc2ab01b5f6d65cb956fd85a41866e80b2ea247892c8ae559d79915589f1b170" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.213909 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/2.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.214490 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/1.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.214519 4961 generic.go:334] "Generic (PLEG): container finished" podID="9a7db599-5d41-4e19-b703-6bb56822b4ff" containerID="bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06" exitCode=2 Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.214536 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerDied","Data":"bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06"} Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.214938 4961 scope.go:117] "RemoveContainer" containerID="bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.215128 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wcr65_openshift-multus(9a7db599-5d41-4e19-b703-6bb56822b4ff)\"" pod="openshift-multus/multus-wcr65" podUID="9a7db599-5d41-4e19-b703-6bb56822b4ff" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.245991 4961 scope.go:117] "RemoveContainer" containerID="fb41218a133691c40c5cf0418e394a5a62b3036af754f0990622a4fa579377c9" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.591223 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-acl-logging/0.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.591687 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-controller/0.log" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.592082 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662136 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jmlkm"] Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662498 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="northd" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662527 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="northd" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662543 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-acl-logging" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662551 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-acl-logging" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662560 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662571 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662581 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662589 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662602 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-node" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662609 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-node" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662623 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="sbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662631 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="sbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662642 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662649 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662657 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kubecfg-setup" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662665 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kubecfg-setup" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662677 4961 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662684 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662692 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662699 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662711 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="nbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662719 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="nbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 19:43:36.662729 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662737 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662854 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-ovn-metrics" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662866 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662875 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="sbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662885 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662895 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="nbdb" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662909 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662920 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-acl-logging" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662929 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="northd" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662941 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="kube-rbac-proxy-node" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.662952 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovn-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: E0201 
19:43:36.663088 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.663098 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.663202 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.663214 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc5aa88-1337-4077-a820-f13902334c80" containerName="ovnkube-controller" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.665192 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709766 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709866 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709907 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709927 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709950 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.709987 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710023 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710019 4961 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710055 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710071 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710087 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710098 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710102 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710093 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash" (OuterVolumeSpecName: "host-slash") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710163 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket" (OuterVolumeSpecName: "log-socket") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710178 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710206 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710235 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710262 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710281 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710305 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710312 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710329 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710360 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710367 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710389 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710405 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710418 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710433 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710437 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710440 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710459 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log" (OuterVolumeSpecName: "node-log") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710441 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg76h\" (UniqueName: \"kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710500 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes\") pod \"1dc5aa88-1337-4077-a820-f13902334c80\" (UID: \"1dc5aa88-1337-4077-a820-f13902334c80\") " Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710651 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710708 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-etc-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710735 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710738 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710757 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-ovn\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710848 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-systemd-units\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710876 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs899\" (UniqueName: \"kubernetes.io/projected/71edee81-db8a-4091-88df-763f7a1cc9b4-kube-api-access-cs899\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710899 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-kubelet\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710927 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-config\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710951 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71edee81-db8a-4091-88df-763f7a1cc9b4-ovn-node-metrics-cert\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710974 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-node-log\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711003 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-netns\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711056 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-slash\") pod 
\"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711076 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711103 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-script-lib\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711163 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-env-overrides\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.710293 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711298 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711321 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-netd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711337 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-var-lib-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711368 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-systemd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711419 4961 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-log-socket\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711476 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-bin\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711567 4961 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711590 4961 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711601 4961 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711611 4961 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711623 4961 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711636 4961 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711647 4961 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-node-log\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711657 4961 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711669 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711679 4961 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711688 4961 reconciler_common.go:293] "Volume detached for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711698 4961 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711708 4961 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711718 4961 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-host-slash\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711729 4961 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711738 4961 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1dc5aa88-1337-4077-a820-f13902334c80-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.711748 4961 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-log-socket\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.715183 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h" (OuterVolumeSpecName: "kube-api-access-vg76h") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "kube-api-access-vg76h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.715474 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.722420 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1dc5aa88-1337-4077-a820-f13902334c80" (UID: "1dc5aa88-1337-4077-a820-f13902334c80"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818694 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-etc-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818743 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-ovn\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818762 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818779 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs899\" (UniqueName: \"kubernetes.io/projected/71edee81-db8a-4091-88df-763f7a1cc9b4-kube-api-access-cs899\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818799 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-systemd-units\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818813 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-kubelet\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818833 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-config\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818849 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71edee81-db8a-4091-88df-763f7a1cc9b4-ovn-node-metrics-cert\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818865 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-node-log\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: 
I0201 19:43:36.818885 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-netns\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818901 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-slash\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818918 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818959 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.818988 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-etc-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819015 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-ovn\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819052 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819335 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-node-log\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819339 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-slash\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819369 4961 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-script-lib\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819429 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-env-overrides\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819496 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819932 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-script-lib\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819951 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-netd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819976 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-var-lib-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820017 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-systemd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820062 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-log-socket\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820095 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-bin\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820188 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-bin\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820364 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-ovnkube-config\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.819846 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-kubelet\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820520 4961 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1dc5aa88-1337-4077-a820-f13902334c80-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820534 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-systemd-units\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820549 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-run-systemd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820567 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-var-lib-openvswitch\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820572 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-netns\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820583 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-log-socket\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820593 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-cni-netd\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820606 4961 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71edee81-db8a-4091-88df-763f7a1cc9b4-host-run-ovn-kubernetes\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820621 4961 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1dc5aa88-1337-4077-a820-f13902334c80-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820631 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg76h\" (UniqueName: \"kubernetes.io/projected/1dc5aa88-1337-4077-a820-f13902334c80-kube-api-access-vg76h\") on node \"crc\" DevicePath \"\"" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.820749 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71edee81-db8a-4091-88df-763f7a1cc9b4-env-overrides\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.822893 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71edee81-db8a-4091-88df-763f7a1cc9b4-ovn-node-metrics-cert\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.835451 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs899\" (UniqueName: \"kubernetes.io/projected/71edee81-db8a-4091-88df-763f7a1cc9b4-kube-api-access-cs899\") pod \"ovnkube-node-jmlkm\" (UID: \"71edee81-db8a-4091-88df-763f7a1cc9b4\") " pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:36 crc kubenswrapper[4961]: I0201 19:43:36.981133 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.223364 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-acl-logging/0.log" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.223853 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ng92q_1dc5aa88-1337-4077-a820-f13902334c80/ovn-controller/0.log" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.224294 4961 generic.go:334] "Generic (PLEG): container finished" podID="1dc5aa88-1337-4077-a820-f13902334c80" containerID="6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea" exitCode=0 Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.224340 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea"} Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.224371 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" event={"ID":"1dc5aa88-1337-4077-a820-f13902334c80","Type":"ContainerDied","Data":"6f1ef0b4b17a5cf9cee32bbc5c6d5e1c67b0d30fb8254462065740c74bccf6cc"} Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.224392 4961 scope.go:117] "RemoveContainer" containerID="82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.224423 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ng92q" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.226699 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/2.log" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.228405 4961 generic.go:334] "Generic (PLEG): container finished" podID="71edee81-db8a-4091-88df-763f7a1cc9b4" containerID="780b9e452ebf59142d5f344fdf8b68dd8d989547d564069fbfa7476fa6fe7c1f" exitCode=0 Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.228434 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerDied","Data":"780b9e452ebf59142d5f344fdf8b68dd8d989547d564069fbfa7476fa6fe7c1f"} Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.228702 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"7503b2388d5439a09ba49fae431cf41e91293af4f20004d9299d1283971c0325"} Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.252627 4961 scope.go:117] "RemoveContainer" containerID="c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.283134 4961 scope.go:117] "RemoveContainer" containerID="3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.301176 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ng92q"] Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.303004 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-ng92q"] Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.320431 4961 scope.go:117] "RemoveContainer" containerID="6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.345148 4961 scope.go:117] "RemoveContainer" containerID="1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.370791 4961 scope.go:117] "RemoveContainer" containerID="4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.383753 4961 scope.go:117] "RemoveContainer" containerID="1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.398187 4961 scope.go:117] "RemoveContainer" containerID="3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.410777 4961 scope.go:117] "RemoveContainer" containerID="50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.426081 4961 scope.go:117] "RemoveContainer" containerID="82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.427434 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4\": container with ID starting with 82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4 not found: ID does not exist" containerID="82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.427463 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4"} err="failed to get container status \"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4\": rpc error: code = NotFound desc = could not find container \"82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4\": container with ID starting with 82b34442fd8f1211c4c595b6063ce680e1020052c45d8fd30af7ac31053be7d4 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.427492 4961 scope.go:117] "RemoveContainer" containerID="c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.427982 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\": container with ID starting with c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3 not found: ID does not exist" containerID="c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428005 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3"} err="failed to get container status \"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\": rpc error: code = NotFound desc = could not find container \"c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3\": container with ID starting with 
c5dfbb5ddb3ef3e7486148a241c58cde8ac0baa7ac5b328e6eb5f792d59579f3 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428021 4961 scope.go:117] "RemoveContainer" containerID="3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.428262 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\": container with ID starting with 3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65 not found: ID does not exist" containerID="3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428282 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65"} err="failed to get container status \"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\": rpc error: code = NotFound desc = could not find container \"3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65\": container with ID starting with 3f044232f97965495d1c102b9d1b47ec71ecf3df307b09d9965cf78e1d6a3e65 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428299 4961 scope.go:117] "RemoveContainer" containerID="6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.428554 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\": container with ID starting with 6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea not found: ID does not exist" containerID="6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428577 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea"} err="failed to get container status \"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\": rpc error: code = NotFound desc = could not find container \"6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea\": container with ID starting with 6cc71334e5e4804a54df42c7d57b6d96cc9e037c79c2c5a2f594b5b1683b77ea not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428601 4961 scope.go:117] "RemoveContainer" containerID="1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.428956 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\": container with ID starting with 1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c not found: ID does not exist" containerID="1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428978 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c"} err="failed to get container status \"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\": rpc 
error: code = NotFound desc = could not find container \"1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c\": container with ID starting with 1dbeea3ad22d68d2fa3ba16ef03ba48766b74d3d56e4da955a729bfb484d501c not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.428989 4961 scope.go:117] "RemoveContainer" containerID="4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.429731 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\": container with ID starting with 4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054 not found: ID does not exist" containerID="4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.429774 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054"} err="failed to get container status \"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\": rpc error: code = NotFound desc = could not find container \"4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054\": container with ID starting with 4b441fc164795fd2d24094dcb854f904f5feb3b10ce2aeee80acc8a1d9043054 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.429796 4961 scope.go:117] "RemoveContainer" containerID="1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.430307 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\": container with ID starting with 1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311 not found: ID does not exist" containerID="1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.430333 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311"} err="failed to get container status \"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\": rpc error: code = NotFound desc = could not find container \"1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311\": container with ID starting with 1930036d772317d0e069af2ee5bb16c812228126bb8e19e44d9e6a686e739311 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.430350 4961 scope.go:117] "RemoveContainer" containerID="3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.430622 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\": container with ID starting with 3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9 not found: ID does not exist" containerID="3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.430651 4961 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9"} err="failed to get container status \"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\": rpc error: code = NotFound desc = could not find container \"3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9\": container with ID starting with 3ff6940dc03daf1682e7d77506f40e8fbbc7178f4f69593f6f99447fad1845e9 not found: ID does not exist" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.430667 4961 scope.go:117] "RemoveContainer" containerID="50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b" Feb 01 19:43:37 crc kubenswrapper[4961]: E0201 19:43:37.431487 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\": container with ID starting with 50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b not found: ID does not exist" containerID="50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b" Feb 01 19:43:37 crc kubenswrapper[4961]: I0201 19:43:37.431511 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b"} err="failed to get container status \"50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\": rpc error: code = NotFound desc = could not find container \"50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b\": container with ID starting with 50f3faaedbca37d6d7a848a8728709121756be8d5021b4f32a79f92ac228872b not found: ID does not exist" Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.240551 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc5aa88-1337-4077-a820-f13902334c80" path="/var/lib/kubelet/pods/1dc5aa88-1337-4077-a820-f13902334c80/volumes" Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241715 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"4012f098740de56f389221187ae8692255f3abb2496ac98ce471ff8d3e943f7a"} Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241741 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"80208e86423553f8f99194669c5ca15ded4f659dc9c7359abdfe478a55f5c2e7"} Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241753 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"f82eadeb2cebac5cea85d7456bbf8bbb3e05bbe06babf13b6b7a25dade341905"} Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241764 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"f48722331860b876b500305aa81e03b42e017e3774b4ad68ea217ae6da9ce0d9"} Feb 01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241777 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"bf3491daeb00fd019eda4eb7044b32d0e5d5fa6671f62137a2548d5ad347fde5"} Feb 
01 19:43:38 crc kubenswrapper[4961]: I0201 19:43:38.241788 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"d303a48130a02101947510ca8af04248479b0ed3b1a4d2cd9175e1376290b903"} Feb 01 19:43:41 crc kubenswrapper[4961]: I0201 19:43:41.272664 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"d31eae6066db0963ec5c72f997e640ff87ed95db352aeae401546cea106f79e4"} Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.288121 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" event={"ID":"71edee81-db8a-4091-88df-763f7a1cc9b4","Type":"ContainerStarted","Data":"a398464c9b82ddc31b409c0d5aacfd8f1dcb689980a855006faa77b02137420b"} Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.288549 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.288564 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.288572 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.314359 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.315854 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:43:43 crc kubenswrapper[4961]: I0201 19:43:43.316674 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" podStartSLOduration=7.316663254 podStartE2EDuration="7.316663254s" podCreationTimestamp="2026-02-01 19:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-01 19:43:43.316250185 +0000 UTC m=+577.841281628" watchObservedRunningTime="2026-02-01 19:43:43.316663254 +0000 UTC m=+577.841694677" Feb 01 19:43:47 crc kubenswrapper[4961]: I0201 19:43:47.236701 4961 scope.go:117] "RemoveContainer" containerID="bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06" Feb 01 19:43:47 crc kubenswrapper[4961]: E0201 19:43:47.237405 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wcr65_openshift-multus(9a7db599-5d41-4e19-b703-6bb56822b4ff)\"" pod="openshift-multus/multus-wcr65" podUID="9a7db599-5d41-4e19-b703-6bb56822b4ff" Feb 01 19:43:50 crc kubenswrapper[4961]: I0201 19:43:50.282023 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:43:50 crc kubenswrapper[4961]: I0201 19:43:50.282183 4961 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:43:50 crc kubenswrapper[4961]: I0201 19:43:50.282288 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:43:50 crc kubenswrapper[4961]: I0201 19:43:50.283974 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707"} pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 19:43:50 crc kubenswrapper[4961]: I0201 19:43:50.284085 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" containerID="cri-o://972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707" gracePeriod=600 Feb 01 19:43:51 crc kubenswrapper[4961]: I0201 19:43:51.339294 4961 generic.go:334] "Generic (PLEG): container finished" podID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerID="972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707" exitCode=0 Feb 01 19:43:51 crc kubenswrapper[4961]: I0201 19:43:51.339374 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerDied","Data":"972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707"} Feb 01 19:43:51 crc kubenswrapper[4961]: I0201 19:43:51.340390 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c"} Feb 01 19:43:51 crc kubenswrapper[4961]: I0201 19:43:51.340439 4961 scope.go:117] "RemoveContainer" containerID="4e080d5473795bf00d433b386421de436789648c183cfb19241bd3f1c6492a31" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.296110 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.297443 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.301797 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-l2kdp" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.301825 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.301915 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.349867 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-log\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.349913 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjx99\" (UniqueName: \"kubernetes.io/projected/50c560b9-c245-40cc-81aa-2bfdb99424a5-kube-api-access-vjx99\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.349942 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-run\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.350097 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-data\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.453741 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-log\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.453804 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjx99\" (UniqueName: \"kubernetes.io/projected/50c560b9-c245-40cc-81aa-2bfdb99424a5-kube-api-access-vjx99\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.453834 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-run\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.453862 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-data\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.454435 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-data\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " 
pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.454655 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-run\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.454909 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/50c560b9-c245-40cc-81aa-2bfdb99424a5-log\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.478868 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjx99\" (UniqueName: \"kubernetes.io/projected/50c560b9-c245-40cc-81aa-2bfdb99424a5-kube-api-access-vjx99\") pod \"ceph\" (UID: \"50c560b9-c245-40cc-81aa-2bfdb99424a5\") " pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: I0201 19:43:58.612427 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 01 19:43:58 crc kubenswrapper[4961]: E0201 19:43:58.662588 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:43:58 crc kubenswrapper[4961]: E0201 19:43:58.675997 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:43:59 crc kubenswrapper[4961]: I0201 19:43:59.394232 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"50c560b9-c245-40cc-81aa-2bfdb99424a5","Type":"ContainerStarted","Data":"57a457e2e32925ea8677409c4ad06d6bfc18d84dc4b64e79855d925955ab18e6"} Feb 01 19:43:59 crc kubenswrapper[4961]: E0201 19:43:59.851950 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:43:59 crc kubenswrapper[4961]: E0201 19:43:59.864811 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:00 crc kubenswrapper[4961]: I0201 19:44:00.235025 4961 scope.go:117] "RemoveContainer" containerID="bd2cc8257fc155aebcaa80268feee1ab6dfb4db90cc47bc8926ec8cb0bb5fd06" Feb 01 19:44:00 crc kubenswrapper[4961]: I0201 19:44:00.401731 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/2.log" Feb 01 19:44:01 crc kubenswrapper[4961]: E0201 19:44:01.029807 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:01 crc kubenswrapper[4961]: E0201 19:44:01.040719 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying 
certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:01 crc kubenswrapper[4961]: I0201 19:44:01.407958 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wcr65_9a7db599-5d41-4e19-b703-6bb56822b4ff/kube-multus/2.log" Feb 01 19:44:01 crc kubenswrapper[4961]: I0201 19:44:01.408015 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wcr65" event={"ID":"9a7db599-5d41-4e19-b703-6bb56822b4ff","Type":"ContainerStarted","Data":"2f2fc941fd766ab186f016ae7a2ca2adfea96029459a3a2c05475c9524f864a9"} Feb 01 19:44:02 crc kubenswrapper[4961]: E0201 19:44:02.193389 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:02 crc kubenswrapper[4961]: E0201 19:44:02.211424 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:03 crc kubenswrapper[4961]: E0201 19:44:03.366973 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:03 crc kubenswrapper[4961]: E0201 19:44:03.378026 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:04 crc kubenswrapper[4961]: E0201 19:44:04.556113 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:04 crc kubenswrapper[4961]: E0201 19:44:04.570346 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:05 crc kubenswrapper[4961]: E0201 19:44:05.710763 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:05 crc kubenswrapper[4961]: E0201 19:44:05.723653 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:06 crc kubenswrapper[4961]: I0201 19:44:06.435831 4961 scope.go:117] "RemoveContainer" containerID="ec4ebc0d368d1cdf7880a2eed4a1c89b68e6d2e356c7da0600cb401b91ba009e" Feb 01 19:44:06 crc kubenswrapper[4961]: E0201 19:44:06.867936 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:06 crc kubenswrapper[4961]: E0201 19:44:06.881262 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:07 crc kubenswrapper[4961]: I0201 19:44:07.011661 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jmlkm" Feb 01 19:44:08 crc kubenswrapper[4961]: E0201 19:44:08.049614 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:08 crc kubenswrapper[4961]: E0201 19:44:08.061370 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:09 crc kubenswrapper[4961]: E0201 19:44:09.234399 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:09 crc kubenswrapper[4961]: E0201 19:44:09.247066 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:10 crc kubenswrapper[4961]: E0201 19:44:10.460368 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:10 crc kubenswrapper[4961]: E0201 19:44:10.477604 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:11 crc kubenswrapper[4961]: E0201 19:44:11.634886 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:11 crc kubenswrapper[4961]: E0201 19:44:11.658186 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:12 crc kubenswrapper[4961]: E0201 19:44:12.805893 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:12 crc kubenswrapper[4961]: E0201 19:44:12.822245 4961 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:14 crc kubenswrapper[4961]: E0201 19:44:13.999742 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:14 crc kubenswrapper[4961]: E0201 19:44:14.010652 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:15 crc kubenswrapper[4961]: E0201 19:44:15.179139 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:15 crc kubenswrapper[4961]: E0201 19:44:15.198944 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:16 crc kubenswrapper[4961]: E0201 19:44:16.339689 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:16 crc kubenswrapper[4961]: E0201 19:44:16.353895 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:17 crc kubenswrapper[4961]: E0201 19:44:17.543302 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:17 crc kubenswrapper[4961]: E0201 19:44:17.558193 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:18 crc kubenswrapper[4961]: E0201 19:44:18.752258 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:18 crc kubenswrapper[4961]: E0201 19:44:18.768953 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:20 crc kubenswrapper[4961]: E0201 19:44:20.022921 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:20 crc kubenswrapper[4961]: I0201 19:44:20.029823 4961 scope.go:117] "RemoveContainer" containerID="853b97815c74999678a2dd0ba0d12a36a111c4fb24715e2f08b70ba74263c25b" Feb 01 19:44:20 crc kubenswrapper[4961]: E0201 19:44:20.045082 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.005733 4961 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.006426 4961 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vjx99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(50c560b9-c245-40cc-81aa-2bfdb99424a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.007653 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="50c560b9-c245-40cc-81aa-2bfdb99424a5" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.219502 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.239489 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, 
AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:21 crc kubenswrapper[4961]: E0201 19:44:21.534228 4961 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="50c560b9-c245-40cc-81aa-2bfdb99424a5" Feb 01 19:44:22 crc kubenswrapper[4961]: E0201 19:44:22.464252 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:22 crc kubenswrapper[4961]: E0201 19:44:22.485593 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:23 crc kubenswrapper[4961]: E0201 19:44:23.681854 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:23 crc kubenswrapper[4961]: E0201 19:44:23.699477 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:24 crc kubenswrapper[4961]: E0201 19:44:24.912539 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:24 crc kubenswrapper[4961]: E0201 19:44:24.927687 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:26 crc kubenswrapper[4961]: E0201 19:44:26.099195 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:26 crc kubenswrapper[4961]: E0201 19:44:26.119600 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:27 crc kubenswrapper[4961]: E0201 19:44:27.294449 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:27 crc kubenswrapper[4961]: E0201 19:44:27.316538 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate 
signed by unknown authority" Feb 01 19:44:28 crc kubenswrapper[4961]: E0201 19:44:28.517864 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:28 crc kubenswrapper[4961]: E0201 19:44:28.539487 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:29 crc kubenswrapper[4961]: E0201 19:44:29.684740 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:29 crc kubenswrapper[4961]: E0201 19:44:29.696906 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:30 crc kubenswrapper[4961]: E0201 19:44:30.894337 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:30 crc kubenswrapper[4961]: E0201 19:44:30.916666 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:32 crc kubenswrapper[4961]: E0201 19:44:32.119354 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:32 crc kubenswrapper[4961]: E0201 19:44:32.140626 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:33 crc kubenswrapper[4961]: E0201 19:44:33.284978 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:33 crc kubenswrapper[4961]: E0201 19:44:33.305962 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:34 crc kubenswrapper[4961]: E0201 19:44:34.514019 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:34 crc kubenswrapper[4961]: E0201 19:44:34.544433 4961 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:35 crc kubenswrapper[4961]: E0201 19:44:35.765557 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:35 crc kubenswrapper[4961]: E0201 19:44:35.819981 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:36 crc kubenswrapper[4961]: I0201 19:44:36.672787 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"50c560b9-c245-40cc-81aa-2bfdb99424a5","Type":"ContainerStarted","Data":"a38a7be94a3ae2c8cff71ecfbfcb0440df7cb9e41aaa845f7fd959390b6ad21e"} Feb 01 19:44:36 crc kubenswrapper[4961]: I0201 19:44:36.686779 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.549987569 podStartE2EDuration="38.686746152s" podCreationTimestamp="2026-02-01 19:43:58 +0000 UTC" firstStartedPulling="2026-02-01 19:43:58.633914089 +0000 UTC m=+593.158945512" lastFinishedPulling="2026-02-01 19:44:35.770672672 +0000 UTC m=+630.295704095" observedRunningTime="2026-02-01 19:44:36.685852769 +0000 UTC m=+631.210884202" watchObservedRunningTime="2026-02-01 19:44:36.686746152 +0000 UTC m=+631.211777585" Feb 01 19:44:36 crc kubenswrapper[4961]: E0201 19:44:36.997027 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:37 crc kubenswrapper[4961]: E0201 19:44:37.015597 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:38 crc kubenswrapper[4961]: E0201 19:44:38.248969 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:38 crc kubenswrapper[4961]: E0201 19:44:38.269211 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:39 crc kubenswrapper[4961]: E0201 19:44:39.448344 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:39 crc kubenswrapper[4961]: E0201 19:44:39.471291 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, 
AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:40 crc kubenswrapper[4961]: E0201 19:44:40.631540 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:40 crc kubenswrapper[4961]: E0201 19:44:40.655380 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:41 crc kubenswrapper[4961]: E0201 19:44:41.854294 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:41 crc kubenswrapper[4961]: E0201 19:44:41.876276 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:43 crc kubenswrapper[4961]: E0201 19:44:43.040328 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:43 crc kubenswrapper[4961]: E0201 19:44:43.055122 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:44 crc kubenswrapper[4961]: E0201 19:44:44.225086 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:44 crc kubenswrapper[4961]: E0201 19:44:44.249412 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:45 crc kubenswrapper[4961]: E0201 19:44:45.445367 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:45 crc kubenswrapper[4961]: E0201 19:44:45.459801 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:46 crc kubenswrapper[4961]: E0201 19:44:46.596655 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by 
unknown authority" Feb 01 19:44:46 crc kubenswrapper[4961]: E0201 19:44:46.618694 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:47 crc kubenswrapper[4961]: E0201 19:44:47.801066 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:47 crc kubenswrapper[4961]: E0201 19:44:47.821235 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:48 crc kubenswrapper[4961]: E0201 19:44:48.962320 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:48 crc kubenswrapper[4961]: E0201 19:44:48.982318 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:50 crc kubenswrapper[4961]: E0201 19:44:50.185002 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:50 crc kubenswrapper[4961]: E0201 19:44:50.199651 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:51 crc kubenswrapper[4961]: E0201 19:44:51.415465 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:51 crc kubenswrapper[4961]: E0201 19:44:51.439539 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:52 crc kubenswrapper[4961]: E0201 19:44:52.634695 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:52 crc kubenswrapper[4961]: E0201 19:44:52.648099 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:53 crc kubenswrapper[4961]: E0201 19:44:53.878252 4961 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:53 crc kubenswrapper[4961]: E0201 19:44:53.900255 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:55 crc kubenswrapper[4961]: E0201 19:44:55.048864 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:55 crc kubenswrapper[4961]: E0201 19:44:55.062646 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:56 crc kubenswrapper[4961]: E0201 19:44:56.267688 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:56 crc kubenswrapper[4961]: E0201 19:44:56.288279 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:57 crc kubenswrapper[4961]: E0201 19:44:57.421775 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:57 crc kubenswrapper[4961]: E0201 19:44:57.446111 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:58 crc kubenswrapper[4961]: E0201 19:44:58.612127 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:58 crc kubenswrapper[4961]: E0201 19:44:58.627679 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:59 crc kubenswrapper[4961]: E0201 19:44:59.838918 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:44:59 crc kubenswrapper[4961]: E0201 19:44:59.856930 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.188184 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756"] Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.189718 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.191969 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.198686 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.201891 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756"] Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.371935 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2fg\" (UniqueName: \"kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.371987 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.372015 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.473321 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2fg\" (UniqueName: \"kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.473370 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.473397 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.474908 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.480303 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.500719 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2fg\" (UniqueName: \"kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg\") pod \"collect-profiles-29499585-kh756\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.516265 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:00 crc kubenswrapper[4961]: I0201 19:45:00.902476 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756"] Feb 01 19:45:01 crc kubenswrapper[4961]: E0201 19:45:01.095002 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:01 crc kubenswrapper[4961]: E0201 19:45:01.110175 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:01 crc kubenswrapper[4961]: I0201 19:45:01.849701 4961 generic.go:334] "Generic (PLEG): container finished" podID="cb8a7037-aab7-4067-b850-bbbc6b9cd12c" containerID="4cdcbb2f010d9bb93e47f63827c2d44cee0e9223eed0f40eddbfe05d054b68a0" exitCode=0 Feb 01 19:45:01 crc kubenswrapper[4961]: I0201 19:45:01.849773 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" event={"ID":"cb8a7037-aab7-4067-b850-bbbc6b9cd12c","Type":"ContainerDied","Data":"4cdcbb2f010d9bb93e47f63827c2d44cee0e9223eed0f40eddbfe05d054b68a0"} Feb 01 19:45:01 crc kubenswrapper[4961]: I0201 19:45:01.849822 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" event={"ID":"cb8a7037-aab7-4067-b850-bbbc6b9cd12c","Type":"ContainerStarted","Data":"8ada3bf36803f8c442d52130e8c3ad0178c3c4b15b6c9b2242765a187d12da0e"} Feb 01 19:45:02 crc kubenswrapper[4961]: E0201 19:45:02.295328 4961 server.go:309] 
"Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:02 crc kubenswrapper[4961]: E0201 19:45:02.313561 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.171161 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.311285 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j2fg\" (UniqueName: \"kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg\") pod \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.311490 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume\") pod \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.311620 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume\") pod \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\" (UID: \"cb8a7037-aab7-4067-b850-bbbc6b9cd12c\") " Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.312139 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb8a7037-aab7-4067-b850-bbbc6b9cd12c" (UID: "cb8a7037-aab7-4067-b850-bbbc6b9cd12c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.312375 4961 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.317824 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb8a7037-aab7-4067-b850-bbbc6b9cd12c" (UID: "cb8a7037-aab7-4067-b850-bbbc6b9cd12c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.321316 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg" (OuterVolumeSpecName: "kube-api-access-5j2fg") pod "cb8a7037-aab7-4067-b850-bbbc6b9cd12c" (UID: "cb8a7037-aab7-4067-b850-bbbc6b9cd12c"). InnerVolumeSpecName "kube-api-access-5j2fg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.413322 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j2fg\" (UniqueName: \"kubernetes.io/projected/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-kube-api-access-5j2fg\") on node \"crc\" DevicePath \"\"" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.413354 4961 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb8a7037-aab7-4067-b850-bbbc6b9cd12c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 01 19:45:03 crc kubenswrapper[4961]: E0201 19:45:03.466496 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:03 crc kubenswrapper[4961]: E0201 19:45:03.477948 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.865068 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.864982 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29499585-kh756" event={"ID":"cb8a7037-aab7-4067-b850-bbbc6b9cd12c","Type":"ContainerDied","Data":"8ada3bf36803f8c442d52130e8c3ad0178c3c4b15b6c9b2242765a187d12da0e"} Feb 01 19:45:03 crc kubenswrapper[4961]: I0201 19:45:03.873223 4961 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ada3bf36803f8c442d52130e8c3ad0178c3c4b15b6c9b2242765a187d12da0e" Feb 01 19:45:04 crc kubenswrapper[4961]: E0201 19:45:04.680636 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:04 crc kubenswrapper[4961]: E0201 19:45:04.695087 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:05 crc kubenswrapper[4961]: E0201 19:45:05.935808 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:05 crc kubenswrapper[4961]: E0201 19:45:05.952492 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:07 crc kubenswrapper[4961]: E0201 19:45:07.150259 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate 
signed by unknown authority" Feb 01 19:45:07 crc kubenswrapper[4961]: E0201 19:45:07.168135 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:08 crc kubenswrapper[4961]: E0201 19:45:08.349467 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:08 crc kubenswrapper[4961]: E0201 19:45:08.368291 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:09 crc kubenswrapper[4961]: E0201 19:45:09.523535 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:09 crc kubenswrapper[4961]: E0201 19:45:09.536915 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:10 crc kubenswrapper[4961]: E0201 19:45:10.714207 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:10 crc kubenswrapper[4961]: E0201 19:45:10.736331 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:11 crc kubenswrapper[4961]: E0201 19:45:11.946547 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:11 crc kubenswrapper[4961]: E0201 19:45:11.968969 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:13 crc kubenswrapper[4961]: E0201 19:45:13.166559 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:13 crc kubenswrapper[4961]: E0201 19:45:13.185359 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:14 crc kubenswrapper[4961]: E0201 19:45:14.431118 4961 
server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:14 crc kubenswrapper[4961]: E0201 19:45:14.474559 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:15 crc kubenswrapper[4961]: E0201 19:45:15.648121 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:15 crc kubenswrapper[4961]: E0201 19:45:15.670964 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:16 crc kubenswrapper[4961]: E0201 19:45:16.896111 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:16 crc kubenswrapper[4961]: E0201 19:45:16.917719 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:18 crc kubenswrapper[4961]: E0201 19:45:18.180524 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:18 crc kubenswrapper[4961]: E0201 19:45:18.200655 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:19 crc kubenswrapper[4961]: E0201 19:45:19.382308 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:19 crc kubenswrapper[4961]: E0201 19:45:19.399271 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:20 crc kubenswrapper[4961]: E0201 19:45:20.596164 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:20 crc kubenswrapper[4961]: E0201 19:45:20.614155 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:21 crc kubenswrapper[4961]: E0201 19:45:21.833137 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:21 crc kubenswrapper[4961]: E0201 19:45:21.857354 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:23 crc kubenswrapper[4961]: E0201 19:45:23.038043 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:23 crc kubenswrapper[4961]: E0201 19:45:23.049084 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:24 crc kubenswrapper[4961]: E0201 19:45:24.217453 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:24 crc kubenswrapper[4961]: E0201 19:45:24.238126 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:25 crc kubenswrapper[4961]: E0201 19:45:25.395763 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:25 crc kubenswrapper[4961]: E0201 19:45:25.416854 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:26 crc kubenswrapper[4961]: E0201 19:45:26.601953 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:26 crc kubenswrapper[4961]: E0201 19:45:26.618874 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:27 crc kubenswrapper[4961]: E0201 19:45:27.794982 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA 
failed: x509: certificate signed by unknown authority" Feb 01 19:45:27 crc kubenswrapper[4961]: E0201 19:45:27.816123 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:29 crc kubenswrapper[4961]: E0201 19:45:29.000781 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:29 crc kubenswrapper[4961]: E0201 19:45:29.022562 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:30 crc kubenswrapper[4961]: E0201 19:45:30.260122 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:30 crc kubenswrapper[4961]: E0201 19:45:30.279155 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:31 crc kubenswrapper[4961]: E0201 19:45:31.476954 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:31 crc kubenswrapper[4961]: E0201 19:45:31.493675 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:32 crc kubenswrapper[4961]: E0201 19:45:32.704819 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:32 crc kubenswrapper[4961]: E0201 19:45:32.723477 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:33 crc kubenswrapper[4961]: E0201 19:45:33.953363 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:33 crc kubenswrapper[4961]: E0201 19:45:33.974455 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:35 crc kubenswrapper[4961]: 
E0201 19:45:35.188401 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:35 crc kubenswrapper[4961]: E0201 19:45:35.208704 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:36 crc kubenswrapper[4961]: E0201 19:45:36.385010 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:36 crc kubenswrapper[4961]: E0201 19:45:36.395625 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:37 crc kubenswrapper[4961]: E0201 19:45:37.576621 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:37 crc kubenswrapper[4961]: E0201 19:45:37.594831 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:38 crc kubenswrapper[4961]: E0201 19:45:38.758507 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:38 crc kubenswrapper[4961]: E0201 19:45:38.772998 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:39 crc kubenswrapper[4961]: E0201 19:45:39.963093 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:39 crc kubenswrapper[4961]: E0201 19:45:39.976613 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:41 crc kubenswrapper[4961]: E0201 19:45:41.123080 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:41 crc kubenswrapper[4961]: E0201 19:45:41.146925 4961 server.go:309] "Unable to authenticate the request due to an error" 
err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:42 crc kubenswrapper[4961]: E0201 19:45:42.326633 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:42 crc kubenswrapper[4961]: E0201 19:45:42.344423 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:43 crc kubenswrapper[4961]: E0201 19:45:43.591964 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:43 crc kubenswrapper[4961]: E0201 19:45:43.613361 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:45 crc kubenswrapper[4961]: E0201 19:45:45.142137 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:45 crc kubenswrapper[4961]: E0201 19:45:45.156724 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:46 crc kubenswrapper[4961]: E0201 19:45:46.368814 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:46 crc kubenswrapper[4961]: E0201 19:45:46.387924 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:47 crc kubenswrapper[4961]: E0201 19:45:47.581023 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:47 crc kubenswrapper[4961]: E0201 19:45:47.600992 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:48 crc kubenswrapper[4961]: E0201 19:45:48.840790 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, 
AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:48 crc kubenswrapper[4961]: E0201 19:45:48.863270 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:50 crc kubenswrapper[4961]: E0201 19:45:50.052443 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:50 crc kubenswrapper[4961]: E0201 19:45:50.071491 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:50 crc kubenswrapper[4961]: I0201 19:45:50.281846 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:45:50 crc kubenswrapper[4961]: I0201 19:45:50.282375 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:45:51 crc kubenswrapper[4961]: E0201 19:45:51.277070 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:51 crc kubenswrapper[4961]: E0201 19:45:51.297128 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:52 crc kubenswrapper[4961]: E0201 19:45:52.508503 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:52 crc kubenswrapper[4961]: E0201 19:45:52.531174 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:53 crc kubenswrapper[4961]: E0201 19:45:53.697853 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:53 crc kubenswrapper[4961]: E0201 19:45:53.715097 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate 
SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:54 crc kubenswrapper[4961]: E0201 19:45:54.901160 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:54 crc kubenswrapper[4961]: E0201 19:45:54.921281 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:56 crc kubenswrapper[4961]: E0201 19:45:56.062135 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:56 crc kubenswrapper[4961]: E0201 19:45:56.074120 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:57 crc kubenswrapper[4961]: E0201 19:45:57.278243 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:57 crc kubenswrapper[4961]: E0201 19:45:57.302325 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:58 crc kubenswrapper[4961]: E0201 19:45:58.520094 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:58 crc kubenswrapper[4961]: E0201 19:45:58.538322 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:59 crc kubenswrapper[4961]: E0201 19:45:59.754918 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:45:59 crc kubenswrapper[4961]: E0201 19:45:59.779668 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:00 crc kubenswrapper[4961]: E0201 19:46:00.992262 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA 
failed: x509: certificate signed by unknown authority" Feb 01 19:46:01 crc kubenswrapper[4961]: E0201 19:46:01.014341 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:02 crc kubenswrapper[4961]: E0201 19:46:02.210632 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:02 crc kubenswrapper[4961]: E0201 19:46:02.226681 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:03 crc kubenswrapper[4961]: E0201 19:46:03.382069 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:03 crc kubenswrapper[4961]: E0201 19:46:03.399375 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:04 crc kubenswrapper[4961]: E0201 19:46:04.576445 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:04 crc kubenswrapper[4961]: E0201 19:46:04.590356 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:05 crc kubenswrapper[4961]: E0201 19:46:05.809631 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:05 crc kubenswrapper[4961]: E0201 19:46:05.823347 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:07 crc kubenswrapper[4961]: E0201 19:46:07.007911 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:07 crc kubenswrapper[4961]: E0201 19:46:07.045520 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:08 crc kubenswrapper[4961]: 
E0201 19:46:08.209801 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:08 crc kubenswrapper[4961]: E0201 19:46:08.228224 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:09 crc kubenswrapper[4961]: E0201 19:46:09.459358 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:09 crc kubenswrapper[4961]: E0201 19:46:09.480339 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:10 crc kubenswrapper[4961]: E0201 19:46:10.652292 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:10 crc kubenswrapper[4961]: E0201 19:46:10.674472 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:11 crc kubenswrapper[4961]: E0201 19:46:11.873156 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:11 crc kubenswrapper[4961]: E0201 19:46:11.890766 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:13 crc kubenswrapper[4961]: E0201 19:46:13.092770 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:13 crc kubenswrapper[4961]: E0201 19:46:13.113650 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:14 crc kubenswrapper[4961]: E0201 19:46:14.320398 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:14 crc kubenswrapper[4961]: E0201 19:46:14.342890 4961 server.go:309] "Unable to authenticate the request due to an error" 
err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:15 crc kubenswrapper[4961]: E0201 19:46:15.522471 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:15 crc kubenswrapper[4961]: E0201 19:46:15.537911 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:16 crc kubenswrapper[4961]: E0201 19:46:16.709888 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:16 crc kubenswrapper[4961]: E0201 19:46:16.723459 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:17 crc kubenswrapper[4961]: E0201 19:46:17.882572 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:17 crc kubenswrapper[4961]: E0201 19:46:17.894332 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:19 crc kubenswrapper[4961]: E0201 19:46:19.049465 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:19 crc kubenswrapper[4961]: E0201 19:46:19.072790 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:20 crc kubenswrapper[4961]: E0201 19:46:20.197501 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:20 crc kubenswrapper[4961]: E0201 19:46:20.212593 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:20 crc kubenswrapper[4961]: I0201 19:46:20.282704 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:46:20 crc kubenswrapper[4961]: I0201 19:46:20.282852 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:46:21 crc kubenswrapper[4961]: E0201 19:46:21.394934 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:21 crc kubenswrapper[4961]: E0201 19:46:21.412454 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:22 crc kubenswrapper[4961]: E0201 19:46:22.584995 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:22 crc kubenswrapper[4961]: E0201 19:46:22.602728 4961 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=3465721633730789685, SKID=, AKID=AE:B2:13:A9:EC:3F:7B:1E:81:F6:D3:D1:19:76:C5:04:4B:FA:8F:AA failed: x509: certificate signed by unknown authority" Feb 01 19:46:44 crc kubenswrapper[4961]: I0201 19:46:44.457324 4961 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.281306 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.281890 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.281941 4961 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.282535 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c"} pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.282594 4961 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" containerID="cri-o://e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c" gracePeriod=600 Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.624894 4961 generic.go:334] "Generic (PLEG): container finished" podID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerID="e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c" exitCode=0 Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.624917 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerDied","Data":"e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c"} Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.624960 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"c9386281dc2f3b973aa5b52c2ba8dc6727d168fe4f4434cf40a6f362c6e0b2b8"} Feb 01 19:46:50 crc kubenswrapper[4961]: I0201 19:46:50.624980 4961 scope.go:117] "RemoveContainer" containerID="972c05ccdf151b9ff7d0c1158a0bac29ca46456db3fb0cfe800bc1f1dbd56707" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.766584 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mh6jk/must-gather-9nbrv"] Feb 01 19:46:53 crc kubenswrapper[4961]: E0201 19:46:53.767364 4961 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8a7037-aab7-4067-b850-bbbc6b9cd12c" containerName="collect-profiles" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.767380 4961 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8a7037-aab7-4067-b850-bbbc6b9cd12c" containerName="collect-profiles" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.767527 4961 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8a7037-aab7-4067-b850-bbbc6b9cd12c" containerName="collect-profiles" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.768195 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.769863 4961 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mh6jk"/"default-dockercfg-m9cxs" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.770162 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mh6jk"/"kube-root-ca.crt" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.770875 4961 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mh6jk"/"openshift-service-ca.crt" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.782716 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mh6jk/must-gather-9nbrv"] Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.809886 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qzzx\" (UniqueName: \"kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.809931 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.911004 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qzzx\" (UniqueName: \"kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.911069 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.911491 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:53 crc kubenswrapper[4961]: I0201 19:46:53.928647 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qzzx\" (UniqueName: \"kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx\") pod \"must-gather-9nbrv\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:54 crc kubenswrapper[4961]: I0201 19:46:54.100691 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:46:54 crc kubenswrapper[4961]: I0201 19:46:54.338067 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mh6jk/must-gather-9nbrv"] Feb 01 19:46:54 crc kubenswrapper[4961]: I0201 19:46:54.650191 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" event={"ID":"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210","Type":"ContainerStarted","Data":"9a97016f524c827c337878b84e239fe4ddc8bd6ed2b215753c96863219f50f53"} Feb 01 19:46:59 crc kubenswrapper[4961]: I0201 19:46:59.681568 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" event={"ID":"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210","Type":"ContainerStarted","Data":"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6"} Feb 01 19:47:00 crc kubenswrapper[4961]: I0201 19:47:00.690238 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" event={"ID":"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210","Type":"ContainerStarted","Data":"621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9"} Feb 01 19:47:00 crc kubenswrapper[4961]: I0201 19:47:00.709527 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" podStartSLOduration=2.709201485 podStartE2EDuration="7.709505186s" podCreationTimestamp="2026-02-01 19:46:53 +0000 UTC" firstStartedPulling="2026-02-01 19:46:54.355862975 +0000 UTC m=+768.880894408" lastFinishedPulling="2026-02-01 19:46:59.356166686 +0000 UTC m=+773.881198109" observedRunningTime="2026-02-01 19:47:00.706262159 +0000 UTC m=+775.231293602" watchObservedRunningTime="2026-02-01 19:47:00.709505186 +0000 UTC m=+775.234536619" Feb 01 19:47:13 crc kubenswrapper[4961]: I0201 19:47:13.031141 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_50c560b9-c245-40cc-81aa-2bfdb99424a5/ceph/0.log" Feb 01 19:47:36 crc kubenswrapper[4961]: I0201 19:47:36.684849 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xzdqx_547d4fdd-999f-4578-9efc-38155da0b2fe/control-plane-machine-set-operator/0.log" Feb 01 19:47:36 crc kubenswrapper[4961]: I0201 19:47:36.773996 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68b9n_86a1f5dd-69f3-4ba2-b48f-14aef559e6b2/kube-rbac-proxy/0.log" Feb 01 19:47:36 crc kubenswrapper[4961]: I0201 19:47:36.802697 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-68b9n_86a1f5dd-69f3-4ba2-b48f-14aef559e6b2/machine-api-operator/0.log" Feb 01 19:47:49 crc kubenswrapper[4961]: I0201 19:47:49.229235 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-4hjhf_f8ee4e6a-e069-4269-bff1-bd1c5df23795/cert-manager-controller/0.log" Feb 01 19:47:49 crc kubenswrapper[4961]: I0201 19:47:49.394598 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-grqlz_0459a3cb-30ac-42fa-8de3-96d9a3d6dbe0/cert-manager-cainjector/0.log" Feb 01 19:47:49 crc kubenswrapper[4961]: I0201 19:47:49.403023 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-2h4t6_c026d1b9-4f13-4322-86de-e329b5e64b26/cert-manager-webhook/0.log" Feb 01 
19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.601758 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-utilities/0.log" Feb 01 19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.767775 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-utilities/0.log" Feb 01 19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.790327 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-content/0.log" Feb 01 19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.814409 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-content/0.log" Feb 01 19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.975220 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-utilities/0.log" Feb 01 19:48:16 crc kubenswrapper[4961]: I0201 19:48:16.977858 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/extract-content/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.174061 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-utilities/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.247359 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-8xfhh_e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/registry-server/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.301786 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-utilities/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.310974 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-content/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.354112 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-content/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.477113 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-utilities/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.489248 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/extract-content/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.614831 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llcxm_94fdbb40-b51a-4305-8d95-a6b6a41f917d/registry-server/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.632125 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gx8gj_280af251-125c-4b96-a327-239a67f4c474/marketplace-operator/0.log" Feb 01 
19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.818843 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-utilities/0.log" Feb 01 19:48:17 crc kubenswrapper[4961]: I0201 19:48:17.984307 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.009533 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.012673 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-utilities/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.160112 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-utilities/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.185973 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.186177 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-h2lhr_7ea3c801-5aaf-41f2-9823-31465d862851/registry-server/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.328983 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-utilities/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.507101 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.507118 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-utilities/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.545824 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.656320 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-utilities/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.671946 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/extract-content/0.log" Feb 01 19:48:18 crc kubenswrapper[4961]: I0201 19:48:18.828471 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-56xxs_f987a0d7-8d43-4ec5-a1dc-84d65534a69a/registry-server/0.log" Feb 01 19:48:50 crc kubenswrapper[4961]: I0201 19:48:50.281452 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:48:50 crc kubenswrapper[4961]: I0201 19:48:50.282992 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.391862 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.399447 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.404484 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.462953 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xr8\" (UniqueName: \"kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.463028 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.463120 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.563937 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xr8\" (UniqueName: \"kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.563994 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.564030 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.564536 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.564617 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.584931 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xr8\" (UniqueName: \"kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8\") pod \"redhat-operators-752mf\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:15 crc kubenswrapper[4961]: I0201 19:49:15.732067 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:16 crc kubenswrapper[4961]: I0201 19:49:16.149977 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:16 crc kubenswrapper[4961]: I0201 19:49:16.624011 4961 generic.go:334] "Generic (PLEG): container finished" podID="853f947f-2fa4-4700-af72-956095d4bf58" containerID="de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2" exitCode=0 Feb 01 19:49:16 crc kubenswrapper[4961]: I0201 19:49:16.624388 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerDied","Data":"de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2"} Feb 01 19:49:16 crc kubenswrapper[4961]: I0201 19:49:16.624414 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerStarted","Data":"d00af7e32f6f4ccb31391457088f2d870038a3faf8527518334f7d6d882655ef"} Feb 01 19:49:16 crc kubenswrapper[4961]: I0201 19:49:16.627203 4961 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 01 19:49:17 crc kubenswrapper[4961]: I0201 19:49:17.631731 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerStarted","Data":"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd"} Feb 01 19:49:18 crc kubenswrapper[4961]: I0201 19:49:18.640111 4961 generic.go:334] "Generic (PLEG): container finished" podID="853f947f-2fa4-4700-af72-956095d4bf58" containerID="bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd" exitCode=0 Feb 01 19:49:18 crc kubenswrapper[4961]: I0201 19:49:18.640174 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerDied","Data":"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd"} Feb 01 19:49:19 crc kubenswrapper[4961]: I0201 19:49:19.650079 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" 
event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerStarted","Data":"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19"} Feb 01 19:49:19 crc kubenswrapper[4961]: I0201 19:49:19.670315 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-752mf" podStartSLOduration=2.252177041 podStartE2EDuration="4.670292524s" podCreationTimestamp="2026-02-01 19:49:15 +0000 UTC" firstStartedPulling="2026-02-01 19:49:16.626998395 +0000 UTC m=+911.152029818" lastFinishedPulling="2026-02-01 19:49:19.045113878 +0000 UTC m=+913.570145301" observedRunningTime="2026-02-01 19:49:19.666316878 +0000 UTC m=+914.191348321" watchObservedRunningTime="2026-02-01 19:49:19.670292524 +0000 UTC m=+914.195323957" Feb 01 19:49:20 crc kubenswrapper[4961]: I0201 19:49:20.282084 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:49:20 crc kubenswrapper[4961]: I0201 19:49:20.282163 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.206548 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.210471 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.249836 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.317675 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.317738 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2bb2\" (UniqueName: \"kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.317765 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.419193 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.419890 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2bb2\" (UniqueName: \"kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.420299 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.419857 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.420580 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.438395 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s2bb2\" (UniqueName: \"kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2\") pod \"community-operators-4fnxx\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.560705 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:24 crc kubenswrapper[4961]: I0201 19:49:24.855886 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.696844 4961 generic.go:334] "Generic (PLEG): container finished" podID="bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" containerID="0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6" exitCode=0 Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.696876 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" event={"ID":"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210","Type":"ContainerDied","Data":"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6"} Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.697594 4961 scope.go:117] "RemoveContainer" containerID="0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6" Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.700922 4961 generic.go:334] "Generic (PLEG): container finished" podID="5971c607-8450-4278-bb00-aeb486959875" containerID="4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253" exitCode=0 Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.701026 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerDied","Data":"4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253"} Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.701161 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerStarted","Data":"2a19219e8bddb7a106b0ff3c5dccc5baf492fe027b04c7df50da76d6928cb08e"} Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.733124 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:25 crc kubenswrapper[4961]: I0201 19:49:25.733194 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:26 crc kubenswrapper[4961]: I0201 19:49:26.549790 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mh6jk_must-gather-9nbrv_bb6b29b7-02ac-4889-aef0-ba0cdcf9a210/gather/0.log" Feb 01 19:49:26 crc kubenswrapper[4961]: I0201 19:49:26.712156 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerStarted","Data":"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a"} Feb 01 19:49:26 crc kubenswrapper[4961]: I0201 19:49:26.789252 4961 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-752mf" podUID="853f947f-2fa4-4700-af72-956095d4bf58" containerName="registry-server" probeResult="failure" output=< Feb 01 19:49:26 crc 
kubenswrapper[4961]: timeout: failed to connect service ":50051" within 1s Feb 01 19:49:26 crc kubenswrapper[4961]: > Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.721347 4961 generic.go:334] "Generic (PLEG): container finished" podID="5971c607-8450-4278-bb00-aeb486959875" containerID="37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a" exitCode=0 Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.721392 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerDied","Data":"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a"} Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.786957 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.788454 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.792523 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.874224 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.874355 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fczl2\" (UniqueName: \"kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.874436 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.975458 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fczl2\" (UniqueName: \"kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.975574 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.975610 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content\") pod \"redhat-marketplace-sh5cf\" 
(UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.976116 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:27 crc kubenswrapper[4961]: I0201 19:49:27.976197 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.006305 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fczl2\" (UniqueName: \"kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2\") pod \"redhat-marketplace-sh5cf\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.139534 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.418248 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:28 crc kubenswrapper[4961]: W0201 19:49:28.425438 4961 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedef59e4_0157_44b6_80d8_c40f6c5abeca.slice/crio-5342d10b2ba76d07cd3d0e7fee7470e5af9543ffa6278f2e5771399e9bd20275 WatchSource:0}: Error finding container 5342d10b2ba76d07cd3d0e7fee7470e5af9543ffa6278f2e5771399e9bd20275: Status 404 returned error can't find the container with id 5342d10b2ba76d07cd3d0e7fee7470e5af9543ffa6278f2e5771399e9bd20275 Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.730912 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerStarted","Data":"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3"} Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.733695 4961 generic.go:334] "Generic (PLEG): container finished" podID="edef59e4-0157-44b6-80d8-c40f6c5abeca" containerID="70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108" exitCode=0 Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.733736 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerDied","Data":"70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108"} Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.733777 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerStarted","Data":"5342d10b2ba76d07cd3d0e7fee7470e5af9543ffa6278f2e5771399e9bd20275"} Feb 01 19:49:28 crc kubenswrapper[4961]: I0201 19:49:28.750059 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-4fnxx" podStartSLOduration=2.297345818 podStartE2EDuration="4.750025772s" podCreationTimestamp="2026-02-01 19:49:24 +0000 UTC" firstStartedPulling="2026-02-01 19:49:25.702668696 +0000 UTC m=+920.227700119" lastFinishedPulling="2026-02-01 19:49:28.15534865 +0000 UTC m=+922.680380073" observedRunningTime="2026-02-01 19:49:28.748476531 +0000 UTC m=+923.273507954" watchObservedRunningTime="2026-02-01 19:49:28.750025772 +0000 UTC m=+923.275057195" Feb 01 19:49:29 crc kubenswrapper[4961]: I0201 19:49:29.742338 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerStarted","Data":"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695"} Feb 01 19:49:30 crc kubenswrapper[4961]: I0201 19:49:30.753176 4961 generic.go:334] "Generic (PLEG): container finished" podID="edef59e4-0157-44b6-80d8-c40f6c5abeca" containerID="0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695" exitCode=0 Feb 01 19:49:30 crc kubenswrapper[4961]: I0201 19:49:30.753352 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerDied","Data":"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695"} Feb 01 19:49:31 crc kubenswrapper[4961]: I0201 19:49:31.762188 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerStarted","Data":"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79"} Feb 01 19:49:31 crc kubenswrapper[4961]: I0201 19:49:31.780747 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sh5cf" podStartSLOduration=2.351206528 podStartE2EDuration="4.780726765s" podCreationTimestamp="2026-02-01 19:49:27 +0000 UTC" firstStartedPulling="2026-02-01 19:49:28.735401202 +0000 UTC m=+923.260432625" lastFinishedPulling="2026-02-01 19:49:31.164921439 +0000 UTC m=+925.689952862" observedRunningTime="2026-02-01 19:49:31.775809215 +0000 UTC m=+926.300840638" watchObservedRunningTime="2026-02-01 19:49:31.780726765 +0000 UTC m=+926.305758188" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.579156 4961 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lzjfk"] Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.581761 4961 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.610808 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzjfk"] Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.648983 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-catalog-content\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.649129 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqg6\" (UniqueName: \"kubernetes.io/projected/ad0135cf-161b-4d94-907d-b707cdcab6a0-kube-api-access-pjqg6\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.649174 4961 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-utilities\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.750904 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-catalog-content\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.751485 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-catalog-content\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.751602 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqg6\" (UniqueName: \"kubernetes.io/projected/ad0135cf-161b-4d94-907d-b707cdcab6a0-kube-api-access-pjqg6\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.751650 4961 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-utilities\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.752383 4961 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad0135cf-161b-4d94-907d-b707cdcab6a0-utilities\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.781058 4961 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pjqg6\" (UniqueName: \"kubernetes.io/projected/ad0135cf-161b-4d94-907d-b707cdcab6a0-kube-api-access-pjqg6\") pod \"certified-operators-lzjfk\" (UID: \"ad0135cf-161b-4d94-907d-b707cdcab6a0\") " pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.908413 4961 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.911899 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mh6jk/must-gather-9nbrv"] Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.912218 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" podUID="bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" containerName="copy" containerID="cri-o://621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9" gracePeriod=2 Feb 01 19:49:33 crc kubenswrapper[4961]: I0201 19:49:33.919693 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mh6jk/must-gather-9nbrv"] Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.144246 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzjfk"] Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.285018 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mh6jk_must-gather-9nbrv_bb6b29b7-02ac-4889-aef0-ba0cdcf9a210/copy/0.log" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.285531 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.357862 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qzzx\" (UniqueName: \"kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx\") pod \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.357924 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output\") pod \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\" (UID: \"bb6b29b7-02ac-4889-aef0-ba0cdcf9a210\") " Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.364241 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx" (OuterVolumeSpecName: "kube-api-access-2qzzx") pod "bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" (UID: "bb6b29b7-02ac-4889-aef0-ba0cdcf9a210"). InnerVolumeSpecName "kube-api-access-2qzzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.407735 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" (UID: "bb6b29b7-02ac-4889-aef0-ba0cdcf9a210"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.459572 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qzzx\" (UniqueName: \"kubernetes.io/projected/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-kube-api-access-2qzzx\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.459607 4961 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.560972 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.561016 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.602030 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.782100 4961 generic.go:334] "Generic (PLEG): container finished" podID="ad0135cf-161b-4d94-907d-b707cdcab6a0" containerID="db24655a5e8fe6ae3d2897845b8526c71819008734d371ad91d8fb28150fbc1c" exitCode=0 Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.782153 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzjfk" event={"ID":"ad0135cf-161b-4d94-907d-b707cdcab6a0","Type":"ContainerDied","Data":"db24655a5e8fe6ae3d2897845b8526c71819008734d371ad91d8fb28150fbc1c"} Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.782177 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzjfk" event={"ID":"ad0135cf-161b-4d94-907d-b707cdcab6a0","Type":"ContainerStarted","Data":"d9183fa8128236f71f854c9e01473504c1e607b29704248f89e80a2ae9164098"} Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.786067 4961 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mh6jk_must-gather-9nbrv_bb6b29b7-02ac-4889-aef0-ba0cdcf9a210/copy/0.log" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.786818 4961 generic.go:334] "Generic (PLEG): container finished" podID="bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" containerID="621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9" exitCode=143 Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.786917 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mh6jk/must-gather-9nbrv" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.787001 4961 scope.go:117] "RemoveContainer" containerID="621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.845320 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.853923 4961 scope.go:117] "RemoveContainer" containerID="0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.889789 4961 scope.go:117] "RemoveContainer" containerID="621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9" Feb 01 19:49:34 crc kubenswrapper[4961]: E0201 19:49:34.890572 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9\": container with ID starting with 621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9 not found: ID does not exist" containerID="621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.890612 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9"} err="failed to get container status \"621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9\": rpc error: code = NotFound desc = could not find container \"621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9\": container with ID starting with 621b525afef6e6022012e8c99a18efc1eb78cfa17f0d35677e9d6dc36842fee9 not found: ID does not exist" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.890638 4961 scope.go:117] "RemoveContainer" containerID="0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6" Feb 01 19:49:34 crc kubenswrapper[4961]: E0201 19:49:34.891165 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6\": container with ID starting with 0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6 not found: ID does not exist" containerID="0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6" Feb 01 19:49:34 crc kubenswrapper[4961]: I0201 19:49:34.891191 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6"} err="failed to get container status \"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6\": rpc error: code = NotFound desc = could not find container \"0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6\": container with ID starting with 0ae1e1cd9fac6291afeff88a822204deeea98f28271f53749a05f3bc646c82a6 not found: ID does not exist" Feb 01 19:49:35 crc kubenswrapper[4961]: I0201 19:49:35.795428 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:35 crc kubenswrapper[4961]: I0201 19:49:35.833359 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:36 crc kubenswrapper[4961]: I0201 19:49:36.246010 4961 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6b29b7-02ac-4889-aef0-ba0cdcf9a210" path="/var/lib/kubelet/pods/bb6b29b7-02ac-4889-aef0-ba0cdcf9a210/volumes" Feb 01 19:49:37 crc kubenswrapper[4961]: I0201 19:49:37.962723 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:37 crc kubenswrapper[4961]: I0201 19:49:37.962957 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4fnxx" podUID="5971c607-8450-4278-bb00-aeb486959875" containerName="registry-server" containerID="cri-o://fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3" gracePeriod=2 Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.140564 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.142332 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.187547 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.508021 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.615488 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2bb2\" (UniqueName: \"kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2\") pod \"5971c607-8450-4278-bb00-aeb486959875\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.615581 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities\") pod \"5971c607-8450-4278-bb00-aeb486959875\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.615652 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content\") pod \"5971c607-8450-4278-bb00-aeb486959875\" (UID: \"5971c607-8450-4278-bb00-aeb486959875\") " Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.616562 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities" (OuterVolumeSpecName: "utilities") pod "5971c607-8450-4278-bb00-aeb486959875" (UID: "5971c607-8450-4278-bb00-aeb486959875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.625610 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2" (OuterVolumeSpecName: "kube-api-access-s2bb2") pod "5971c607-8450-4278-bb00-aeb486959875" (UID: "5971c607-8450-4278-bb00-aeb486959875"). InnerVolumeSpecName "kube-api-access-s2bb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.664295 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5971c607-8450-4278-bb00-aeb486959875" (UID: "5971c607-8450-4278-bb00-aeb486959875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.717128 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2bb2\" (UniqueName: \"kubernetes.io/projected/5971c607-8450-4278-bb00-aeb486959875-kube-api-access-s2bb2\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.717182 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.717196 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5971c607-8450-4278-bb00-aeb486959875-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.820989 4961 generic.go:334] "Generic (PLEG): container finished" podID="5971c607-8450-4278-bb00-aeb486959875" containerID="fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3" exitCode=0 Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.821061 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerDied","Data":"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3"} Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.821078 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4fnxx" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.821096 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4fnxx" event={"ID":"5971c607-8450-4278-bb00-aeb486959875","Type":"ContainerDied","Data":"2a19219e8bddb7a106b0ff3c5dccc5baf492fe027b04c7df50da76d6928cb08e"} Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.821121 4961 scope.go:117] "RemoveContainer" containerID="fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.835940 4961 generic.go:334] "Generic (PLEG): container finished" podID="ad0135cf-161b-4d94-907d-b707cdcab6a0" containerID="d9a3a9bbde0760dfb035e1e41e52bcc7023e1c6f3febaa574fe146df43092692" exitCode=0 Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.836011 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzjfk" event={"ID":"ad0135cf-161b-4d94-907d-b707cdcab6a0","Type":"ContainerDied","Data":"d9a3a9bbde0760dfb035e1e41e52bcc7023e1c6f3febaa574fe146df43092692"} Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.861326 4961 scope.go:117] "RemoveContainer" containerID="37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.881641 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.885677 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4fnxx"] Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.886944 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.899526 4961 scope.go:117] "RemoveContainer" containerID="4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.916336 4961 scope.go:117] "RemoveContainer" containerID="fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3" Feb 01 19:49:38 crc kubenswrapper[4961]: E0201 19:49:38.916848 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3\": container with ID starting with fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3 not found: ID does not exist" containerID="fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.916900 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3"} err="failed to get container status \"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3\": rpc error: code = NotFound desc = could not find container \"fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3\": container with ID starting with fa45d6ec220d73fbc5b8c80879571cc6ac56d9f2598c861e3b8fe88d99a565d3 not found: ID does not exist" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.916930 4961 scope.go:117] "RemoveContainer" containerID="37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a" Feb 01 19:49:38 crc kubenswrapper[4961]: E0201 19:49:38.917323 4961 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a\": container with ID starting with 37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a not found: ID does not exist" containerID="37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.917369 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a"} err="failed to get container status \"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a\": rpc error: code = NotFound desc = could not find container \"37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a\": container with ID starting with 37c3881c4b8b6608b6196eba8ea3fc9e11c7b7b8dec9572958f2b52c3b24f74a not found: ID does not exist" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.917396 4961 scope.go:117] "RemoveContainer" containerID="4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253" Feb 01 19:49:38 crc kubenswrapper[4961]: E0201 19:49:38.917868 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253\": container with ID starting with 4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253 not found: ID does not exist" containerID="4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253" Feb 01 19:49:38 crc kubenswrapper[4961]: I0201 19:49:38.917907 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253"} err="failed to get container status \"4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253\": rpc error: code = NotFound desc = could not find container \"4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253\": container with ID starting with 4f6bf16e9728fab3007a2723dae22be9719f57e6adefb3f05adbd9100fe86253 not found: ID does not exist" Feb 01 19:49:39 crc kubenswrapper[4961]: I0201 19:49:39.843589 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lzjfk" event={"ID":"ad0135cf-161b-4d94-907d-b707cdcab6a0","Type":"ContainerStarted","Data":"fed7d4997c663b6ee794e802244fa67cc5ed6508639723d8e603f34f5bf955e9"} Feb 01 19:49:39 crc kubenswrapper[4961]: I0201 19:49:39.863434 4961 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lzjfk" podStartSLOduration=2.363299468 podStartE2EDuration="6.863421198s" podCreationTimestamp="2026-02-01 19:49:33 +0000 UTC" firstStartedPulling="2026-02-01 19:49:34.783711208 +0000 UTC m=+929.308742631" lastFinishedPulling="2026-02-01 19:49:39.283832898 +0000 UTC m=+933.808864361" observedRunningTime="2026-02-01 19:49:39.85939923 +0000 UTC m=+934.384430653" watchObservedRunningTime="2026-02-01 19:49:39.863421198 +0000 UTC m=+934.388452621" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.160769 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.160998 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-752mf" 
podUID="853f947f-2fa4-4700-af72-956095d4bf58" containerName="registry-server" containerID="cri-o://d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19" gracePeriod=2 Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.243365 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5971c607-8450-4278-bb00-aeb486959875" path="/var/lib/kubelet/pods/5971c607-8450-4278-bb00-aeb486959875/volumes" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.530353 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.660595 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities\") pod \"853f947f-2fa4-4700-af72-956095d4bf58\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.660671 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content\") pod \"853f947f-2fa4-4700-af72-956095d4bf58\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.660767 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2xr8\" (UniqueName: \"kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8\") pod \"853f947f-2fa4-4700-af72-956095d4bf58\" (UID: \"853f947f-2fa4-4700-af72-956095d4bf58\") " Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.662299 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities" (OuterVolumeSpecName: "utilities") pod "853f947f-2fa4-4700-af72-956095d4bf58" (UID: "853f947f-2fa4-4700-af72-956095d4bf58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.669206 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8" (OuterVolumeSpecName: "kube-api-access-p2xr8") pod "853f947f-2fa4-4700-af72-956095d4bf58" (UID: "853f947f-2fa4-4700-af72-956095d4bf58"). InnerVolumeSpecName "kube-api-access-p2xr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.762325 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2xr8\" (UniqueName: \"kubernetes.io/projected/853f947f-2fa4-4700-af72-956095d4bf58-kube-api-access-p2xr8\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.762359 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.780618 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "853f947f-2fa4-4700-af72-956095d4bf58" (UID: "853f947f-2fa4-4700-af72-956095d4bf58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.853283 4961 generic.go:334] "Generic (PLEG): container finished" podID="853f947f-2fa4-4700-af72-956095d4bf58" containerID="d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19" exitCode=0 Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.853357 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-752mf" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.853394 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerDied","Data":"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19"} Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.853452 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-752mf" event={"ID":"853f947f-2fa4-4700-af72-956095d4bf58","Type":"ContainerDied","Data":"d00af7e32f6f4ccb31391457088f2d870038a3faf8527518334f7d6d882655ef"} Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.853479 4961 scope.go:117] "RemoveContainer" containerID="d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.863629 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/853f947f-2fa4-4700-af72-956095d4bf58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.874254 4961 scope.go:117] "RemoveContainer" containerID="bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.886740 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.893118 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-752mf"] Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.917233 4961 scope.go:117] "RemoveContainer" containerID="de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.935354 4961 scope.go:117] "RemoveContainer" containerID="d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19" Feb 01 19:49:40 crc kubenswrapper[4961]: E0201 19:49:40.936594 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19\": container with ID starting with d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19 not found: ID does not exist" containerID="d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.936627 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19"} err="failed to get container status \"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19\": rpc error: code = NotFound desc = could not find container \"d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19\": container with ID starting with d1b82d5964a3e6da2b1d0f68dac34ac0bbeda940e8769eb786995c32c7498d19 not found: ID does not exist" Feb 01 19:49:40 crc 
kubenswrapper[4961]: I0201 19:49:40.936649 4961 scope.go:117] "RemoveContainer" containerID="bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd" Feb 01 19:49:40 crc kubenswrapper[4961]: E0201 19:49:40.937135 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd\": container with ID starting with bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd not found: ID does not exist" containerID="bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.937167 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd"} err="failed to get container status \"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd\": rpc error: code = NotFound desc = could not find container \"bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd\": container with ID starting with bb5c2235e186833490f5fe03db4f8b6f4b5427e1e9eb2129bc23ade8534d24bd not found: ID does not exist" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.937188 4961 scope.go:117] "RemoveContainer" containerID="de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2" Feb 01 19:49:40 crc kubenswrapper[4961]: E0201 19:49:40.937548 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2\": container with ID starting with de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2 not found: ID does not exist" containerID="de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2" Feb 01 19:49:40 crc kubenswrapper[4961]: I0201 19:49:40.937571 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2"} err="failed to get container status \"de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2\": rpc error: code = NotFound desc = could not find container \"de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2\": container with ID starting with de560f66c8e19e2e98af67bb0f5355219a3c9736dbdb6e30e4ac5d71932387b2 not found: ID does not exist" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.250710 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="853f947f-2fa4-4700-af72-956095d4bf58" path="/var/lib/kubelet/pods/853f947f-2fa4-4700-af72-956095d4bf58/volumes" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.366950 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.367400 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sh5cf" podUID="edef59e4-0157-44b6-80d8-c40f6c5abeca" containerName="registry-server" containerID="cri-o://8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79" gracePeriod=2 Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.777351 4961 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.881698 4961 generic.go:334] "Generic (PLEG): container finished" podID="edef59e4-0157-44b6-80d8-c40f6c5abeca" containerID="8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79" exitCode=0 Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.881746 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerDied","Data":"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79"} Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.881772 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh5cf" event={"ID":"edef59e4-0157-44b6-80d8-c40f6c5abeca","Type":"ContainerDied","Data":"5342d10b2ba76d07cd3d0e7fee7470e5af9543ffa6278f2e5771399e9bd20275"} Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.881789 4961 scope.go:117] "RemoveContainer" containerID="8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.881930 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh5cf" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.905467 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fczl2\" (UniqueName: \"kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2\") pod \"edef59e4-0157-44b6-80d8-c40f6c5abeca\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.915771 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content\") pod \"edef59e4-0157-44b6-80d8-c40f6c5abeca\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.917637 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities\") pod \"edef59e4-0157-44b6-80d8-c40f6c5abeca\" (UID: \"edef59e4-0157-44b6-80d8-c40f6c5abeca\") " Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.918710 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities" (OuterVolumeSpecName: "utilities") pod "edef59e4-0157-44b6-80d8-c40f6c5abeca" (UID: "edef59e4-0157-44b6-80d8-c40f6c5abeca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.935222 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2" (OuterVolumeSpecName: "kube-api-access-fczl2") pod "edef59e4-0157-44b6-80d8-c40f6c5abeca" (UID: "edef59e4-0157-44b6-80d8-c40f6c5abeca"). InnerVolumeSpecName "kube-api-access-fczl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.935864 4961 scope.go:117] "RemoveContainer" containerID="0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.951853 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edef59e4-0157-44b6-80d8-c40f6c5abeca" (UID: "edef59e4-0157-44b6-80d8-c40f6c5abeca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.962817 4961 scope.go:117] "RemoveContainer" containerID="70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.979576 4961 scope.go:117] "RemoveContainer" containerID="8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79" Feb 01 19:49:42 crc kubenswrapper[4961]: E0201 19:49:42.980497 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79\": container with ID starting with 8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79 not found: ID does not exist" containerID="8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.980554 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79"} err="failed to get container status \"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79\": rpc error: code = NotFound desc = could not find container \"8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79\": container with ID starting with 8483c954c5ff887b93f91da6702b64563d8142b1c58a2657901baea8b96c5b79 not found: ID does not exist" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.980595 4961 scope.go:117] "RemoveContainer" containerID="0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695" Feb 01 19:49:42 crc kubenswrapper[4961]: E0201 19:49:42.981110 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695\": container with ID starting with 0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695 not found: ID does not exist" containerID="0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.981144 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695"} err="failed to get container status \"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695\": rpc error: code = NotFound desc = could not find container \"0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695\": container with ID starting with 0b184df7707afff0bf838de277091d69e68393f2b7dbfb2817d1233ca5d82695 not found: ID does not exist" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.981161 4961 scope.go:117] "RemoveContainer" containerID="70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108" Feb 01 19:49:42 crc kubenswrapper[4961]: 
E0201 19:49:42.981899 4961 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108\": container with ID starting with 70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108 not found: ID does not exist" containerID="70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108" Feb 01 19:49:42 crc kubenswrapper[4961]: I0201 19:49:42.981978 4961 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108"} err="failed to get container status \"70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108\": rpc error: code = NotFound desc = could not find container \"70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108\": container with ID starting with 70aef7459475bbde9cdfa5c060bdbda6a88cdfed193b20724cf9c164c84f1108 not found: ID does not exist" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.020123 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fczl2\" (UniqueName: \"kubernetes.io/projected/edef59e4-0157-44b6-80d8-c40f6c5abeca-kube-api-access-fczl2\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.020156 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.020169 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef59e4-0157-44b6-80d8-c40f6c5abeca-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.219560 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.224121 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh5cf"] Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.909351 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.910985 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:43 crc kubenswrapper[4961]: I0201 19:49:43.947636 4961 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:44 crc kubenswrapper[4961]: I0201 19:49:44.244642 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edef59e4-0157-44b6-80d8-c40f6c5abeca" path="/var/lib/kubelet/pods/edef59e4-0157-44b6-80d8-c40f6c5abeca/volumes" Feb 01 19:49:44 crc kubenswrapper[4961]: I0201 19:49:44.941909 4961 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lzjfk" Feb 01 19:49:46 crc kubenswrapper[4961]: I0201 19:49:46.401807 4961 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lzjfk"] Feb 01 19:49:46 crc kubenswrapper[4961]: I0201 19:49:46.771321 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:49:46 crc 
kubenswrapper[4961]: I0201 19:49:46.772064 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8xfhh" podUID="e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" containerName="registry-server" containerID="cri-o://a20ccecafb6394299d55482bcdd64a73c9b9e7f82d4c3d27707c9b11371b7e4d" gracePeriod=2 Feb 01 19:49:46 crc kubenswrapper[4961]: I0201 19:49:46.908141 4961 generic.go:334] "Generic (PLEG): container finished" podID="e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" containerID="a20ccecafb6394299d55482bcdd64a73c9b9e7f82d4c3d27707c9b11371b7e4d" exitCode=0 Feb 01 19:49:46 crc kubenswrapper[4961]: I0201 19:49:46.908238 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerDied","Data":"a20ccecafb6394299d55482bcdd64a73c9b9e7f82d4c3d27707c9b11371b7e4d"} Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.189720 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.338445 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities\") pod \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.338546 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p4rz\" (UniqueName: \"kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz\") pod \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.338604 4961 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content\") pod \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\" (UID: \"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52\") " Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.339355 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities" (OuterVolumeSpecName: "utilities") pod "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" (UID: "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.340756 4961 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-utilities\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.359743 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz" (OuterVolumeSpecName: "kube-api-access-9p4rz") pod "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" (UID: "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52"). InnerVolumeSpecName "kube-api-access-9p4rz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.401465 4961 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" (UID: "e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.442262 4961 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p4rz\" (UniqueName: \"kubernetes.io/projected/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-kube-api-access-9p4rz\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.442307 4961 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.919921 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xfhh" event={"ID":"e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52","Type":"ContainerDied","Data":"3446b1be5d339b0a53b9106ba8be8d3c70641deae02b2db5c4e4b02487adb5da"} Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.920008 4961 scope.go:117] "RemoveContainer" containerID="a20ccecafb6394299d55482bcdd64a73c9b9e7f82d4c3d27707c9b11371b7e4d" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.920088 4961 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xfhh" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.951495 4961 scope.go:117] "RemoveContainer" containerID="bb478315b62e8809827141bd6800aae54ac451e96320a864745034329c79903a" Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.970547 4961 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.978869 4961 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8xfhh"] Feb 01 19:49:47 crc kubenswrapper[4961]: I0201 19:49:47.994235 4961 scope.go:117] "RemoveContainer" containerID="5c5eb6666c02f8dd488105c4e2eab6865657b419e4520ada5d73b1bcb53372b3" Feb 01 19:49:48 crc kubenswrapper[4961]: I0201 19:49:48.245612 4961 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52" path="/var/lib/kubelet/pods/e18b1c2a-910a-4e83-bcd3-6b5cc87b6d52/volumes" Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.281466 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.281933 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.281995 4961 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.283115 4961 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c9386281dc2f3b973aa5b52c2ba8dc6727d168fe4f4434cf40a6f362c6e0b2b8"} pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.283204 4961 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" containerID="cri-o://c9386281dc2f3b973aa5b52c2ba8dc6727d168fe4f4434cf40a6f362c6e0b2b8" gracePeriod=600 Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.951420 4961 generic.go:334] "Generic (PLEG): container finished" podID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerID="c9386281dc2f3b973aa5b52c2ba8dc6727d168fe4f4434cf40a6f362c6e0b2b8" exitCode=0 Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.951763 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerDied","Data":"c9386281dc2f3b973aa5b52c2ba8dc6727d168fe4f4434cf40a6f362c6e0b2b8"} Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.951808 4961 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" event={"ID":"0f613a17-3bb1-43f0-8552-0e55a2018f4c","Type":"ContainerStarted","Data":"567d7c09c4131986c668cc291803a83b17ad233ff4681e674d26365ee1b57e98"} Feb 01 19:49:50 crc kubenswrapper[4961]: I0201 19:49:50.951838 4961 scope.go:117] "RemoveContainer" containerID="e56e6851aa710a0556f0bc0d0d6be33becd254ce5bb21ee7d558efdcdd9f165c" Feb 01 19:51:50 crc kubenswrapper[4961]: I0201 19:51:50.281674 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:51:50 crc kubenswrapper[4961]: I0201 19:51:50.283135 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 01 19:52:20 crc kubenswrapper[4961]: I0201 19:52:20.281886 4961 patch_prober.go:28] interesting pod/machine-config-daemon-fgjhv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 01 19:52:20 crc kubenswrapper[4961]: I0201 19:52:20.283281 4961 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fgjhv" podUID="0f613a17-3bb1-43f0-8552-0e55a2018f4c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz
var/home/core/zuul-output/logs/crc-cloud/
var/home/core/zuul-output/artifacts/
var/home/core/zuul-output/docs/